PDF Summary: Like, Comment, Subscribe, by Mark Bergen

Book Summary: Learn the key points in minutes.

Below is a preview of the Shortform book summary of Like, Comment, Subscribe by Mark Bergen. Read the full comprehensive summary at Shortform.

1-Page PDF Summary of Like, Comment, Subscribe

What started as a simple platform for sharing homemade videos has evolved into a global media giant. Like, Comment, Subscribe by Mark Bergen takes readers behind YouTube's meteoric rise, detailing its scrappy early days as a startup, its acquisition by Google, and its transformation into a powerhouse that has captivated billions of viewers.

The book delves into YouTube's complex relationships with its creators, advertisers, and critics. It explores YouTube's growing pains, from establishing a lucrative business model to tackling issues like moderation, misinformation, and child safety. Bergen examines the platform's influence on society and politics, including its role in events such as the Arab Spring protests and the 2016 U.S. presidential election.

(continued)...

  • The algorithm change may have inadvertently reduced the diversity of content on the platform, as creators moved away from short, viral formats to align with the new priorities.
  • This shift also impacted the way creators structured their content, often leading to more in-depth storytelling or gameplay analysis to maintain viewer engagement.
Pressure on Content Creators to Produce High-Quality, Advertiser-Friendly Content

The author examines how YouTube’s algorithms drove creators to favor certain types of videos, particularly longer fare with more commercials. This shift, which was part of YouTube's goal to become the internet's television service, had several unintended consequences. Creators began producing "safe" YouTube videos, mirroring the commercial sensibility of daytime TV. Many felt pressure to post more frequently or abandon their artfulness for the algorithmic logic of the "low-quality video classifier."

Other Perspectives

  • The platform's algorithms are periodically updated, which means that the incentives for creators can change over time, and what may be true at one point may not hold in the future.
  • The term "safe" is subjective, and what may seem safe or conventional to some may be innovative or edgy to others, depending on the context and the audience's expectations.
  • The pressure to post frequently is not solely due to algorithmic demands but also due to audience expectations and the desire for consistent engagement.
  • The idea of artistic integrity is personal and varies from creator to creator; what might be seen as a sacrifice for one could be a natural progression for another.
Irritation at YouTube's Opaque Rules and Inconsistent Enforcement

Many YouTubers, according to Bergen, grew frustrated with the opaque, ever-shifting policies governing everything from how content was compensated to how offensive speech was removed. This frustration mounted during the advertiser boycotts, when creators' incomes were slashed without clear explanations. And it worsened as a growing number of creators came to depend on YouTube ad revenue for their livelihoods. They believed the company viewed them as replaceable cogs in its enormous machine, which, in many respects, was true.

Other Perspectives

  • The platform provides resources and communication channels, such as the YouTube Creator Academy and support teams, to help YouTubers understand and navigate policy changes.
  • Policies regarding content compensation and the removal of offensive speech are necessary to maintain a platform that is advertiser-friendly and accessible to a wide audience.
  • Some creators may have experienced an increase in viewership and support from their audience during advertiser boycotts, as viewers may rally around their favorite YouTubers in times of perceived unfair treatment.
  • YouTube may have provided explanations for income reductions, but the communication might not have been effectively received or understood by all creators.
  • While it's true that many creators rely on YouTube ad sales for their livelihoods, it's also important to recognize that diversifying income streams is a fundamental business strategy, and relying on a single platform for income is inherently risky.
  • The company has made efforts to communicate changes and updates to its policies, though not always perfectly, indicating an attempt to be transparent with its creator community.

YouTube's Push For Mainstream TV Appeal and Challenges in Moderation and Governance

Content Moderation Challenge on a User-Generated Video Platform With an Immense Amount of Content

Bergen details the evolution of YouTube's moderation strategy as the platform grew to an unprecedented size. Initially, the company could rely chiefly on human oversight, albeit with limited staff; later the company relied more heavily on algorithms and machine learning as the sheer volume of videos uploaded every minute soared. However, YouTube's attempts to police a platform with more languages and users than ever before inevitably led to mistakes, biases, and a growing awareness of the company’s outsized power in deciding what speech was permissible, what was censored, and what was promoted.

Early YouTube Content Moderation: How Squad Used Guidelines and Flags

The author highlights how YouTube tried to address problematic content. Concerns about content and conduct on YouTube have existed since the platform's inception. To address these, the company created SQUAD (Safety, Quality, and User Advocacy), a moderation team tasked with sifting through footage flagged by viewers and staff for violations of YouTube's guidelines. Initially, moderators worked together in a compact workspace, relying on a manual of rules and guidelines to make judgment calls; later the operation grew to encompass reviewers on different continents working with an increasingly automated system.

Context

  • Concerns also arose about the platform's influence on public opinion, behavior, and its role in spreading misinformation or extremist content.
  • SQUAD's work was influenced by legal obligations and ethical considerations, balancing free expression with the need to protect users from harmful content.
  • At the time, technology for video analysis was limited, making human review essential for understanding context, intent, and potential harm in videos.
  • Initially, the moderation process was heavily reliant on human reviewers who manually assessed flagged content to determine if it violated guidelines.
  • Having reviewers across different continents allows for 24/7 content moderation, ensuring quicker response times to flagged content regardless of when it is reported.
Algorithms and AI for Removing Problematic Content

YouTube began deploying algorithms and machine learning systems for content moderation as the volume of uploaded footage became too great for humans to manage alone. These systems were far faster and more scalable than human reviewers, capable of processing vast quantities of video in dozens of languages. However, they struggled with nuanced, qualitative decisions, particularly on videos that straddled the line between commentary, criticism, and hate speech.

Context

  • Algorithms can quickly scan and flag content based on predefined criteria, such as detecting certain keywords or patterns that may indicate problematic content.
  • Algorithms apply rules uniformly and without fatigue, making moderation more consistent than human review, though they can still reproduce biases baked into their rules or training data.
  • These systems can be trained and improved over time to better recognize patterns and nuances in content, although they may still struggle with context and cultural subtleties.
  • The rapid pace at which new content is created and the diverse ways in which language is used online make it challenging for algorithms to stay updated and effective in real-time moderation.
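The keyword-and-pattern flagging these notes describe can be sketched in a few lines. This is a minimal illustration, not YouTube's implementation: the patterns, function name, and review-queue behavior are all invented, and production systems rely on trained classifiers rather than hand-written keyword lists.

```python
import re

# Invented example patterns; a real moderation system would use trained
# classifiers, not a hand-written keyword list.
FLAG_PATTERNS = [
    re.compile(r"\bbuy (likes|subscribers)\b", re.IGNORECASE),  # engagement spam
    re.compile(r"\bfree v-?bucks\b", re.IGNORECASE),            # scam bait
]

def flag_video(title: str, description: str) -> bool:
    """Return True if the metadata matches any predefined pattern.

    A match only queues the video for human review: pattern matching
    alone cannot judge context, intent, or newsworthiness.
    """
    text = f"{title} {description}"
    return any(p.search(text) for p in FLAG_PATTERNS)

print(flag_video("FREE V-Bucks giveaway!!", "click now"))   # True
print(flag_video("Street protest footage, Cairo", ""))      # False
```

The second call shows the limitation the bullets point to: a rule list says nothing about whether protest footage is newsworthy documentation or a guideline violation; that judgment still falls to humans or far more sophisticated models.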
Global Policing Challenges: Hate Speech, Extremism, Misinformation

The author chronicles how, after several years of media and legal scrutiny, YouTube shifted its content moderation strategy to address political extremism and conspiracy theories in ways the company had previously avoided. The company introduced new rules and algorithms after terrorist content was linked to its site and in response to public pressure in the wake of the 2016 U.S. presidential election, which highlighted YouTube's influence in disseminating and shaping public discourse. The Elsagate child-safety scandal likewise prompted a sweeping change in YouTube's processes.

Practical Tips

  • Develop a personal code of ethics for your online behavior. Write down a set of principles that guide how you interact with content and people online, including not sharing unverified information and reporting suspicious activities. This personal code acts as a compass for responsible digital citizenship and can influence others to consider their online actions more carefully.
  • Use sentiment analysis tools to gauge public opinion on topics relevant to you. Many online tools and services can analyze text from social media posts, comments, and reviews to determine the overall sentiment (positive, negative, or neutral) of the discussion. By regularly analyzing this data, you can get a sense of how the public feels about certain issues and decide if you need to make any changes to your approach or communication.
  • Educate your peers about the importance of digital vigilance by starting a virtual book club or discussion group focused on online safety. Choose books or articles that discuss the impact of digital content on society, and use these discussions to brainstorm ways to promote a safer online community. This not only spreads awareness but also encourages collective responsibility for the digital ecosystem.

YouTube's Efforts to Attract Hollywood's Premium Content

Bergen recounts YouTube’s quest to bring more professional content to its service as well as to lure advertisers who supported Hollywood and mainstream television. Despite its growing audience, YouTube struggled to attract media companies to its service. Several negotiations collapsed, hampered by Google's aversion to fees and a continued skepticism about the value of an online video platform built on an amateur creator model.

YouTube Originals: Financing Scripted Programs With Influential Creators

The author details how YouTube's pursuit of mainstream television viewers led the company to launch Originals, a slate of premium shows funded by YouTube and featuring well-known creators. YouTube's Hollywood representative, Robert Kyncl, wanted to replicate Netflix’s model of creating original programming, only with a very YouTubian approach. Instead of commissioning TV veterans as other streaming services did, YouTube bankrolled select creators, hoping to bring their enormous followings to paid platforms.

Practical Tips

  • Organize virtual watch parties for YouTube Originals, inviting friends and family who are primarily television viewers. By creating a communal viewing experience similar to traditional TV watching, you can introduce YouTube Originals to a new audience in a familiar and social setting, potentially converting them into regular YouTube viewers.
  • Develop a critical eye for premium content by watching and analyzing a variety of YouTube Originals and other high-quality web series. Take notes on what elements make them stand out, such as storytelling techniques, production values, and audience interaction. Use these observations to inform your own content creation or to provide constructive feedback to creators you follow.
  • Experiment with remixing existing works in a new context to spark originality. Take a concept from one domain and apply it to another to create something novel. For example, if you're a fan of a particular music genre, try incorporating its rhythm or style into a different medium, like creating a series of paintings inspired by the music's mood and tempo.
  • Collaborate with other content creators to produce a mini-series. Reach out to YouTubers who share similar interests or complement your content and propose a collaborative project. This could be a limited series with a specific theme or storyline. By pooling resources and cross-promoting, you can tap into each other's audiences and create a buzz similar to what a new Netflix series might generate.
  • Explore crowdfunding platforms to finance your creative project, mirroring the investment model used by YouTube for its Originals. By setting up a campaign on sites like Kickstarter or Indiegogo, you can pitch your idea to the public, offering rewards for different levels of support. This approach allows you to gauge interest in your project and secure funding without needing a corporate backer.
  • You can start a niche interest group on social media to attract a dedicated following. By focusing on a specific topic you're passionate about, you can create content that resonates with a particular audience. For example, if you're into vintage comic books, create a Facebook group or Instagram page where you share your collection, discuss character histories, and review classic stories. As your group's popularity grows, you can explore monetization options like exclusive memberships or merchandise.
YouTube Fights to Compete for Viewers and Advertisers With Streaming Giants

The author examines how the Originals series, while initially generating a groundswell of attention and marketing hype, ultimately struggled to gain a foothold against rivals. These new platforms lured creators and viewers with lucrative offers, greater security, and, importantly, professional infrastructure for production. YouTube Originals, despite their big budgets and state-of-the-art production facilities, didn't take off. For many YouTubers, working on YouTube Originals felt antithetical to the kind of work that made them successful.

Context

  • Streaming giants often invest in a wide variety of content, including exclusive series, films, and documentaries, appealing to diverse audiences and leveraging data analytics to tailor offerings.
  • These platforms invested in cutting-edge technology and high-quality production tools, enabling creators to produce content that met the high standards expected by audiences accustomed to traditional television and film.
  • Independent creators usually have full control over their content, including scripting, editing, and publishing. In contrast, YouTube Originals likely involve more oversight and input from producers and executives.

Bergen describes how YouTube responded to mounting legal and regulatory concerns, particularly those relating to copyright, harassment, and children's privacy and safety. At first, YouTube was unprepared for the issues these topics would raise. But as the site became a major force in media and commerce, the company had to respond.

In 2007, Viacom, which owned MTV and several other entertainment properties, sued YouTube for copyright infringement on its platform. The suit, which YouTube ultimately won, spurred the company to develop effective copyright protection systems to appease the entertainment companies it hoped to collaborate with. Content ID, an innovative tool for recognizing infringing content, became YouTube's most effective innovation in securing financial stability and legal legitimacy.

Context

  • Viacom's lawsuit claimed that YouTube had knowingly allowed users to upload thousands of videos containing Viacom's copyrighted material, seeking over $1 billion in damages.
  • The case set a precedent for how online platforms handle user-generated content and copyright issues, influencing other tech companies and shaping digital copyright law.
  • The introduction of Content ID marked a shift in YouTube's business model, as it began to focus more on partnerships with media companies and creators, offering them a share of ad revenue generated from their content.
  • For users, Content ID can sometimes lead to disputes over fair use, as the system may flag content that is legally permissible under copyright law, such as parodies or educational uses.
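Content ID's fingerprinting is proprietary, but the matching idea described here (compare pieces of an upload against an index of reference works) can be sketched under loud assumptions: real systems use perceptual hashes computed over overlapping windows so matches survive re-encoding and time shifts, whereas this toy version hashes exact, aligned chunks.

```python
from collections import defaultdict

CHUNK = 4  # illustrative chunk length; real fingerprints span seconds of audio/video

def fingerprints(samples, chunk=CHUNK):
    """Hash fixed-length chunks of a stream (toy stand-in for perceptual hashing)."""
    return [hash(tuple(samples[i:i + chunk]))
            for i in range(0, len(samples) - chunk + 1, chunk)]

def build_index(reference_works):
    """Map each reference chunk hash to the works containing it."""
    index = defaultdict(set)
    for work_id, samples in reference_works.items():
        for fp in fingerprints(samples):
            index[fp].add(work_id)
    return index

def match(upload, index, threshold=2):
    """Report reference works sharing at least `threshold` chunks with the upload."""
    hits = defaultdict(int)
    for fp in fingerprints(upload):
        for work_id in index[fp]:
            hits[work_id] += 1
    return {work for work, n in hits.items() if n >= threshold}

refs = {"song_a": list(range(40))}           # stand-in for a licensed recording
index = build_index(refs)
upload = list(range(8, 32)) + [99, 98, 97]   # clip that reuses part of song_a
print(match(upload, index))                  # {'song_a'}
```

A matched work would then trigger one of the policy outcomes mentioned above (block, monetize on the rights holder's behalf, or track), and because matching is approximate, disputes over cases like parody or fair use are inevitable.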
#YouTubeBlack Initiative and Efforts to Tackle Diversity Issues

The author details how YouTube responded to charges that it was failing its creators of color, particularly Black YouTubers. Akilah Hughes delivered a harsh indictment with her "YouTube Is Racist" article, prompting a #YouTubeBlack event and a company pledge to increase its support. However, Hughes and others felt the company's efforts amounted to symbolic gestures, with its algorithms continuing to surface Black creators less often than those in its broader content pool.

Practical Tips

  • Create a playlist of videos from Black creators and share it on your social media platforms. This simple act can amplify their work and potentially introduce them to new audiences who may not have found them otherwise.
  • Create a simple browser extension that randomly selects videos from diverse creators. Use it to expose yourself to a wider range of perspectives and content, which can help counteract any unintentional bias in your viewing habits and support creators from various backgrounds.
  • Create a feedback channel for your community or audience where they can suggest topics or creators that deserve more attention. This could be a simple online form or an email address dedicated to collecting suggestions. Act on this feedback by highlighting these topics or creators in your content, thereby fostering a more inclusive environment.
  • You can evaluate the authenticity of a company's actions by tracking their follow-up initiatives. After a company makes a public statement or takes an action that seems symbolic, keep an eye on their subsequent moves. Do they implement new policies, invest in community programs, or show measurable progress towards the goals they've set? This will help you discern whether their initial actions were genuine or merely for show.
  • Advocate for algorithmic transparency by reaching out to social media companies. Draft and send letters or emails to these companies, asking for clarity on how their algorithms work and suggesting the implementation of regular audits to ensure racial equity. While you may not have the power to change the algorithms yourself, collective action and consumer pressure can lead to significant policy changes.
Elsagate and How YouTube Reacted to Child Exploitation

The author recounts how YouTube had to make sweeping service and policy changes after its Kids app was found to host exploitative, disturbing content targeted at children. The Elsagate scandal compelled YouTube to abandon its stance of neutrality, particularly where children were involved. YouTube executives faced the challenge of walking a fine line between policing troubling user-created material and protecting free speech.

Practical Tips

  • You can enhance your online safety by regularly reviewing and updating your privacy settings on social media platforms. After learning about the Elsagate scandal, take the time to go through your accounts and ensure that your personal information is shared only with people you trust. For example, on Facebook, you might limit your past posts to friends only or customize who can see your future posts.
  • Collaborate with other parents to create a shared list of safe apps and content for children. By pooling knowledge and resources, you can collectively identify and recommend apps that are educational, entertaining, and free from exploitive material. This could be done through a shared online document or a private social media group where parents can post reviews and experiences with different apps.
  • Encourage critical thinking by discussing online content with your children. After they watch a video, ask them questions about what they saw, what they think about it, and if they understand the difference between advertisements and content. This can help them develop the skills to discern content quality and intent on their own.
  • Engage in community discussions on platforms like Reddit or Nextdoor to practice balancing free speech with community standards. Participate in or initiate conversations about controversial topics and observe how different opinions are expressed and moderated. This hands-on approach will give you insight into the challenges of content moderation and the importance of protecting free speech while maintaining a respectful dialogue.

External Events, Internal Dynamics, and Regulatory Pressures Affecting YouTube's Approaches

Arab Spring: YouTube's Role in Documenting Revolutions and Unrest

Bergen describes how the Arab Spring propelled YouTube onto the world’s news stage. With traditional media outlets restricted from covering the revolutions in Tunisia and Egypt, citizens turned to the platform to share footage, which quickly spread to TV. This brought a tide of new viewers to the platform and, critically, an image boost, moving the site away from its dogs-on-skateboards reputation toward becoming the go-to destination for social protest and meaningful world events.

Citizen Reporting: YouTube's Function in Exposing Events Ignored by Traditional Media

The author highlights YouTube's emergence as a tool of citizen journalism during the Arab Spring. When protesters filled the streets of Tunisia in January 2011, their cell-phone footage poured onto YouTube. The platform was already becoming a destination for videos documenting local events and newsworthy happenings (such as, in the US, a video exposing corruption at the military contractor Lockheed Martin), but the uprisings in the Middle East and North Africa took this genre to an entirely different scale. Iran's 2009 Green Revolution had already showcased YouTube's capability to document global political events.

Practical Tips

  • Create a collaborative YouTube playlist where multiple contributors can add videos related to a specific social issue or event. This can serve as a collective diary or archive that captures diverse viewpoints and experiences, offering a richer, community-driven narrative.
  • Engage in digital activism by creating online petitions or supporting existing ones on issues you're passionate about. Websites like Change.org make it easy to start a petition, and sharing it through your social networks can lead to significant impact. This can be a powerful way to show solidarity with movements globally and to contribute to change from your own home.
  • Organize a monthly virtual town hall meeting via video conferencing where community members can present and discuss recent local events. This fosters a sense of community and keeps everyone informed about what's happening in their area. You could use a free platform like Zoom or Skype and set up a recurring event, inviting residents to share stories, news, and updates from their neighborhood.
  • Enhance your media literacy by learning to identify credible sources and fact-check information you come across online. Websites like FactCheck.org or Snopes can be useful for verifying stories and claims. This skill will help you discern the validity of content related to corporate behavior and avoid spreading misinformation.
  • Volunteer to translate and subtitle citizen-reported videos to broaden their reach. By using your language skills to make these reports accessible to a non-native audience, you contribute to the global exchange of information and support the dissemination of diverse perspectives.
  • Use video editing apps to compile user-generated content from political events around the world into informative compilations. Reach out to contributors for permission to use their footage, then edit the videos to highlight common themes or contrasting perspectives on global political issues, helping to broaden viewers' understanding of international politics.
Dilemmas Of Moderating Conflict Zone Content

The author examines YouTube's challenges in moderating content and balancing its dedication to freedom of expression with the moral imperatives of screening violence. YouTube’s rules prohibited material showing graphic sexual content and severe violence, yet citizen journalists documenting street protests in Cairo and Tunis often uploaded jarring videos of citizens bloodied by police. The company depended on exemptions for videos reflecting “newsworthiness,” and it had to grapple with the limits of those exemptions when the Green Revolution, with its cell-phone footage sent out as a stream of information, turned violent.

Practical Tips

  • Create a simple blog to document and discuss non-violent responses to conflicts in your area. By focusing on peaceful protests, community dialogues, and conflict resolution strategies, you provide a platform that showcases alternative methods of addressing social issues, encouraging others to consider non-violent approaches.

Other Perspectives

  • The difficulty in moderating content from conflict zones may be overstated, as YouTube has the ability to employ advanced algorithms and machine learning tools to assist in identifying and filtering content.
  • The platform's reliance on user reporting and automated systems may result in uneven enforcement of content moderation policies, which can undermine the balance it seeks to achieve.
  • The definition of "severe violence" can be subjective, and what is considered severe by some may not be by others, leading to inconsistencies in content moderation.
  • Determining newsworthiness often requires quick judgment calls that may not always align with the public interest or journalistic standards.
  • The company's struggle with exemptions during violent events may not be due to the limits of the exemptions themselves, but rather the company's interpretation and application of these exemptions.
  • The term "stream of information" suggests a continuous and comprehensive flow, but in reality, the footage may have been sporadic and fragmented.

Gamergate Controversy and Alt-Right Emergence on YouTube

Bergen chronicles how Gamergate, an internet culture war over sexism and political correctness in video games, spurred a growing force on the platform: the alt-right. Gamergate drew in the "skeptic" YouTubers, a collective of atheist video creators and freethinkers who debated and ridiculed one another on the platform and attracted new viewers by tagging their titles with searchable keywords. As the 2016 election drew closer, figures like Stefan Molyneux, a self-described philosopher with a libertarian slant, leaned into politics on the platform and built an enormous following of young men.

Fringe Political Voices Exploit YouTube Algorithms to Gain Followings

The author details how a faction of YouTubers leveraged YouTube's systems and its lack of a political presence. The emergence of the alt-right, with prominent figures such as Milo Yiannopoulos, a contributor to Breitbart News, and Stefan Molyneux, revealed how an increasingly fractured and chaotic internet could amplify voices the mainstream press often dismissed. Alongside YouTube's embrace of long-form video, this fueled these voices and the growth of their platforms.

Other Perspectives

  • The term "exploit" suggests a negative connotation, but savvy use of YouTube's features and understanding audience preferences can be seen as smart marketing rather than exploitation.
  • YouTube has community guidelines and policies that are intended to prevent hate speech and the spread of misinformation, which apply to all users regardless of their political stance.
  • Mainstream press dismissal does not necessarily lead to amplification on the internet; some voices remain marginalized despite the fragmented nature of the internet.
  • The growth of fringe political platforms may be due more to the content and messaging rather than the video format, as short-form videos can also go viral and gain a large following.
Distinguishing Political Content From Misinformation and Speech That Spreads Hate

This section reveals how YouTube's leadership and staff struggled to draw distinctions between legitimate political discourse and hateful speech, and to build and implement moderation policies accordingly. While the company could more easily address outright calls for hate or violence, it tried to avoid placing limits on those who walked the line between provocative commentary and unacceptable conduct, or on those, like PewDiePie, who recklessly mixed the two.

Context

  • With hundreds of hours of video uploaded every minute, the sheer volume of content makes it challenging for YouTube to manually review and moderate all videos effectively.
  • The company is under constant scrutiny from governments, advocacy groups, and the public, each with differing opinions on what constitutes acceptable content.
  • There are numerous precedents and case studies of similar content being removed, which provide a framework for decision-making and enforcement.
  • Controversial content can affect monetization and partnerships for creators like PewDiePie, as brands and advertisers may distance themselves from content that could harm their reputation.

Trump Election and YouTube's Role in Spreading Conspiracies

Bergen chronicles how YouTube and its algorithms became a stage for spreading hate, fear, and conspiracies amid Donald Trump's rise in the 2016 election. Fringe voices used YouTube's systems (its search, its suggestion algorithm, and its long-form video format) to generate viewership around inflammatory topics like immigration and terrorism, often making Hillary Clinton their target. At YouTube headquarters, however, the political implications of this trend were either misunderstood or downplayed.

How the Right Expanded: Stefan Molyneux and Alex Jones on YouTube

Molyneux and Alex Jones, a conspiracy theorist, found huge success by bringing a formula that had worked so well for talk radio (a provocative style of delivering pronouncements on politics and the "truth") to the YouTube platform. This formula, which succeeded by attacking the so-called mainstream press and elites, resonated with Trump supporters in the US. In this chapter, Bergen explores how these pundits used YouTube to attack Hillary Clinton, the Democratic nominee, and to spread dubious claims about her health and her ties to globalists and financiers.

Context

  • Conspiracy theories often spread rapidly on social media due to their sensational nature. Algorithms on platforms like YouTube can amplify such content by recommending it to users who show interest in similar topics.

Other Perspectives

  • It's possible that their success is not just a result of their style but also the content of their messages, which may have appealed to certain viewers' pre-existing beliefs and biases.
  • Some Trump supporters may have been drawn to these pundits for reasons other than their attacks on the mainstream press and elites, such as seeking alternative viewpoints or dissatisfaction with political coverage elsewhere.
Company Struggles With Algorithmic Bias and Misleading Information

The author recounts how the site's algorithm promoted controversial videos and fringe political actors. The algorithms were designed to identify and prioritize content based on popularity and time viewed, but these systems ignored qualitative factors relating to accuracy, bias, or even the identities behind channels. This problem hit home soon after election interference by Russian agents became a prominent political scandal, sparking criticism that YouTube, like other platforms, had failed to reckon with its role in influencing elections.

Other Perspectives

  • The issue of algorithmic bias is complex and multifaceted, and it may require a more nuanced approach that balances freedom of expression with the need to limit the spread of misinformation.
  • Prioritizing content based on popularity and time viewed is a business decision aimed at maximizing user engagement and, by extension, advertising revenue, which is a legitimate goal for a private company.
  • The responsibility for election interference may not rest solely on YouTube, as other actors, such as the individuals creating misleading content or state actors engaging in coordinated campaigns, play significant roles.

Regulatory Pressures to Reduce YouTube's Influence

Bergen explores how YouTube responded to two major crises that threatened its business model and required the company to implement sweeping changes: an advertiser boycott (the "Adpocalypse") and a monumental fine from the U.S. government. The company's initial approach to these challenges, which relied on deploying more algorithms and engineers and emphasizing automation, did not always go over well with YouTube's creators or government officials, who wanted greater control or transparency.

"Adpocalypse" as Advertisers Flee YouTube Over Brand Safety Concerns

The author details how a series of news articles revealed an unintended consequence of YouTube's advertising system: major brands, including Procter & Gamble, had inadvertently placed commercials next to videos espousing racism, violence, and Holocaust denial. This spurred a wave of advertiser boycotts, the "Adpocalypse," which cost YouTube billions of dollars. The company responded by tightening restrictions on creators' eligibility to run ads on their videos, which angered many YouTubers but largely appeased advertisers.

Other Perspectives

  • The issue may highlight the need for better tools and controls for advertisers to manage where their ads appear, rather than an intentional oversight or negligence on the part of the brands.
  • The use of the term "Adpocalypse" might overshadow the positive outcomes of the situation, such as increased awareness and action towards responsible advertising and content moderation.
  • The financial losses attributed to the boycotts may not take into account the potential savings from not sharing ad revenue with creators whose content was deemed inappropriate, which could offset some of the lost income.
  • The implementation of tighter restrictions might lead to over-censorship, where content that is not necessarily harmful is also demonetized due to an overzealous application of the rules.
  • Creators who already adhered to community guidelines and produced advertiser-friendly content might have seen the tightening of restrictions as an opportunity to stand out and attract more advertisers.
  • Some advertisers might feel that the response was more of a quick fix rather than a long-term solution, and they could be skeptical about YouTube's commitment to continuously improving brand safety measures.

FTC's Fine Against YouTube For Violating Children's Privacy

The author documents how YouTube's struggles with problematic content for kids, known as "Elsagate," ultimately led to a monumental fine from the Federal Trade Commission (FTC) for violating children's privacy law. YouTube had introduced an app specifically for children under thirteen, YouTube Kids, without adequately curating its selection, an approach that ran afoul of laws prohibiting sites and services aimed at children from collecting their data for use in targeted advertising. Such data collection was, in essence, YouTube's business model.

Context

  • The Elsagate scandal drew significant media attention and public outcry, leading to increased scrutiny of YouTube's content moderation practices and its impact on children.
  • The FTC's action against YouTube was one of the largest fines ever imposed for COPPA violations, highlighting the seriousness of the issue and setting a precedent for future enforcement.
  • While YouTube Kids includes parental control features, critics argue that these measures were initially insufficient to prevent exposure to harmful content.
  • Identifying users under 13 can be challenging for platforms, leading to debates about the effectiveness and implementation of age verification systems.
  • The reliance on data collection for targeted advertising raises privacy concerns, especially for vulnerable groups like children. This has led to increased regulatory scrutiny and calls for more stringent privacy protections.
Regulating YouTube's Societal Impact

This section details how a growing contingent of citizens, commentators, and politicians argued that regulation was needed to curb both social media abuses and the outsized power the industry exercised in controlling personal data and online discourse. These concerns centered on several issues: the spread of misinformation and conspiracy theories, the exploitation of children, and the addictiveness of online platforms. Ultimately, governments began moving to rein in Silicon Valley's power.

Practical Tips

  • You can start a digital literacy journal to track and reflect on your social media use, noting instances where regulation might benefit your experience. By keeping a daily log of the time spent on different platforms, the types of content encountered, and your emotional response to them, you'll develop a personal understanding of the potential impact of regulation. For example, if you notice a pattern of misleading information in your feed, this could highlight the need for better content oversight.
  • You can become a more discerning social media user by setting up personal content filters. Use existing social media platform tools to block or hide content that seems abusive or manipulative. For example, if you notice a pattern of negative content from certain sources, use the platform's features to mute, unfollow, or block those sources to create a healthier digital environment for yourself.
  • Engage in shareholder activism if you own stocks by voting on corporate policies that promote fair competition and regulation. Even as a small shareholder, you can join coalitions that influence corporate governance. For instance, if you own shares in a tech company, you can vote for measures that limit anti-competitive practices.
  • Start a "Misinformation Journal" to track false narratives you encounter. Whenever you come across a piece of questionable information, jot it down and note why it seemed suspicious. Over time, this will help you recognize patterns in misinformation and become more immune to deceptive tactics.
  • You can scrutinize product labels to ensure you're not supporting child labor by checking for certifications like Fair Trade or Rainforest Alliance. These certifications often indicate that the product was made without exploiting children, as they have strict standards for labor practices. For example, when buying chocolate or coffee, look for these labels to make an informed choice that aligns with ethical labor practices.
  • Create a personal reward system that incentivizes reduced screen time. Set daily or weekly screen time goals and reward yourself when you meet them. For example, if your goal is to spend no more than an hour on social media per day, treat yourself to a favorite activity or small indulgence when you succeed. This positive reinforcement can help you gradually reduce your reliance on online platforms.
  • You can support local tech initiatives to diversify the technology landscape. By choosing to use and promote tech products and services created by smaller, local companies, you contribute to a more balanced market. For example, if you're looking for a new messaging app, consider one developed by a local startup instead of the mainstream options.
