Addicted to Scrolling? 3 Small Changes to STOP Feeling Drained After Scrolling Social Media

By iHeartPodcasts

In this episode of On Purpose, Jay Shetty examines how social media algorithms influence user behavior and mental health. He explores the mechanics behind these systems, which track user interactions to create personalized content feeds, and explains how they tend to amplify negative content—including how moral outrage increases content sharing by up to 8%, and how false news spreads six times faster than true stories.

Shetty discusses the psychological impact of these algorithms, from how they affect different demographics to the formation of echo chambers, which emerge even on platforms without recommendation systems. The episode covers potential solutions to these challenges, including implementing chronological feeds, adding engagement friction points, and requiring algorithm transparency. It also addresses how users can develop better social media habits through practices like meditation and critical thinking.

This is a preview of the Shortform summary of the Dec 5, 2025 episode of On Purpose with Jay Shetty.


1-Page Summary

Social Media Algorithms and Their Impact on Behavior

Social media algorithms are sophisticated systems designed to monitor and influence user behavior, as explained by Jay Shetty and various researchers. These algorithms track every interaction, from the duration of video views to hovering over comments, creating personalized content feeds that keep users engaged.

How Algorithms Shape User Experience

The algorithms excel at amplifying content that triggers emotional responses, particularly negative ones. According to Yale researchers, each additional word of moral outrage in a post is associated with a 5-8% increase in shares and retweets. This creates a feedback loop in which negative content gets boosted, leading to what Shetty describes as a cycle of negativity and division.

Psychological Impact and Mental Health

The algorithmic promotion of certain content types has significant psychological effects. For women, the algorithms often push appearance-focused content that can trigger insecurity. For men, TikTok studies show that exposure to misogynistic content can increase feelings of loneliness within just five days. The constant encouragement of self-comparison leads to elevated anxiety and self-doubt across all users.

User Psychology and Algorithm Reinforcement

Even without algorithmic recommendations, human psychology plays a crucial role in how content spreads. Shetty notes that false news stories are 70% more likely to be retweeted than true ones and reach 1,500 people six times faster. The University of Amsterdam's research showed that even on platforms without recommendation algorithms, users naturally gravitate toward like-minded individuals, forming echo chambers.

Proposed Solutions

To address these challenges, Shetty recommends several solutions: implementing chronological feeds by default, adding friction points to encourage mindful engagement, and requiring algorithm transparency. For instance, Twitter's experiment with prompting users to read articles before retweeting led to a 40% increase in reading behavior. Additionally, Shetty suggests that users can practice emotional mastery through meditation and critical thinking to better navigate social media's challenges.

Additional Materials

Clarifications

  • Jay Shetty is a former monk turned motivational speaker and author known for his insights on mindfulness and human behavior. He has a large social media following and often discusses how technology affects mental health. His perspective is relevant because he blends spiritual wisdom with modern psychology, yielding practical advice on managing digital well-being.
  • Social media algorithms are computer programs that analyze user data to predict and prioritize content you are most likely to engage with. They use machine learning techniques to continuously improve these predictions based on your interactions, such as likes, shares, and time spent on posts. These algorithms rank and filter vast amounts of content in real-time to create a personalized feed tailored to your preferences. Their goal is to maximize user engagement by showing content that keeps you active on the platform longer.
  • Moral outrage is a strong emotional reaction to perceived injustice or wrongdoing. It motivates people to share content to express their values and seek social support. This emotional intensity makes posts more engaging and likely to spread quickly. Social media algorithms detect this engagement and prioritize such content, increasing its visibility.
  • A feedback loop in social media occurs when the algorithm promotes content that generates strong reactions, causing users to engage more with similar content. This increased engagement signals the algorithm to show even more of that type of content, reinforcing user behavior. Over time, this cycle intensifies certain emotions or viewpoints, often amplifying negativity or division. It creates a self-reinforcing pattern where content popularity drives further exposure.
  • Appearance-focused content often leads women to compare themselves unfavorably to idealized images, increasing body dissatisfaction. This can contribute to lower self-esteem and higher rates of anxiety and depression. Repeated exposure may also reinforce unrealistic beauty standards, intensifying feelings of inadequacy. These psychological effects can negatively impact overall mental health and well-being.
  • Misogynistic content on TikTok includes videos or comments that demean, objectify, or promote negative stereotypes about women. Examples are clips that mock women's appearances, reinforce gender roles, or spread harmful myths about female behavior. Such content can subtly influence viewers by normalizing disrespect and inequality. This exposure contributes to feelings of loneliness and insecurity among male viewers.
  • False news spreads faster because it often contains novel, surprising information that captures attention more effectively than familiar truths. People are psychologically drawn to emotionally charged content, which false news frequently exploits by triggering fear, anger, or excitement. Additionally, confirmation bias leads individuals to share information that aligns with their existing beliefs without verifying accuracy. Social media platforms amplify this by prioritizing engaging content, regardless of its truthfulness.
  • Echo chambers are environments where people are exposed only to opinions that match their own, reinforcing their existing beliefs. They form naturally because individuals tend to connect with others who share similar views, a behavior known as homophily. This occurs even without algorithms, as people seek comfort and validation in like-minded groups. Over time, this limits exposure to diverse perspectives and strengthens group consensus.
  • Chronological feeds display posts in the exact order they were published, showing the newest content first without filtering. Algorithmic feeds use complex software to prioritize and show content based on predicted user interest, engagement, and behavior patterns. This means algorithmic feeds often reorder posts to highlight what the system thinks will keep users engaged longer. Chronological feeds offer a neutral timeline, while algorithmic feeds personalize and curate content dynamically.
  • "Friction points" are deliberate obstacles or extra steps added to social media actions to slow down impulsive behavior. They might include prompts asking users to confirm before sharing or requiring a pause before posting. These points encourage users to think more carefully about their interactions. The goal is to reduce mindless engagement and promote more thoughtful content sharing.
  • Algorithm transparency means making the rules and processes behind content selection visible to users and regulators. It helps people understand why they see certain posts, reducing manipulation and building trust. Implementation can include publishing algorithm criteria, offering user controls to adjust content preferences, and independent audits. Transparency also enables accountability, encouraging platforms to prioritize ethical design.
  • In 2020, Twitter tested a feature that prompted users to read an article before retweeting it. The prompt appeared if a user tried to retweet a link without opening it first. This nudge aimed to reduce the spread of misinformation by encouraging informed sharing. The experiment showed a 40% increase in article reading before retweets.
  • Emotional mastery is the ability to recognize, understand, and regulate your emotions effectively. Meditation helps by increasing mindfulness, allowing users to observe their reactions without immediately acting on them. Critical thinking encourages questioning the content and motives behind posts, reducing impulsive sharing or emotional responses. Together, these skills help users engage with social media more thoughtfully and reduce negative emotional impact.

Counterarguments

  • While social media algorithms are designed to keep users engaged, it's not solely the algorithms that are responsible for user behavior; users have agency and can choose how they interact with social media.
  • The claim that algorithms primarily amplify negative content could be oversimplified, as they also promote positive and neutral content that engages users.
  • The feedback loop of negativity might not be solely due to algorithms but also due to users' own biases and preferences for engaging with such content.
  • The impact of appearance-focused content on women's insecurity is a complex issue that may involve societal pressures beyond social media algorithms.
  • The study linking TikTok exposure to increased loneliness in men may not account for other variables in users' lives that contribute to loneliness.
  • The assertion that false news spreads faster due to human psychology might not consider efforts by social media platforms to counteract misinformation and educate users.
  • The formation of echo chambers could be a result of user choice and the desire for community rather than just algorithmic sorting.
  • Chronological feeds might not necessarily reduce the negative impacts of social media, as users could still seek out and engage with harmful content.
  • Adding friction points could potentially inconvenience users and lead to a decrease in platform engagement, which might not be in the best interest of social media companies.
  • Algorithm transparency is a complex issue that involves trade-offs between user privacy, intellectual property, and the effectiveness of the algorithms themselves.
  • The effectiveness of Twitter's experiment with prompting users to read articles before retweeting might not be generalizable across all platforms or types of content.
  • Emotional mastery techniques like meditation and critical thinking are personal strategies that may not address the systemic issues related to social media's design and business models.


Social Media Algorithms and Their Impact on Behavior

Jay Shetty and other sources reveal that social media algorithms are intricately designed to amplify engaging content, exploit psychological biases, and ultimately have significant effects on users’ behavior, mental health, and well-being.

Algorithms Amplify Engaging Content on Social Media

Social media platforms like TikTok are known for closely monitoring user interactions such as pauses, clicks, likes, shares, and even the duration of hovering over videos or comments. These algorithms record behaviors to predict and promote content users are likely to engage with, thereby creating a feedback loop to retain users.

Algorithms Track User Behaviors to Predict Content Engagement

The algorithm prioritizes user engagement, gauging interest by tracking watch time to the second, rewatches, and other interactions. Content that garners more emotional engagement is promoted to more people, reinforcing users' preferences and behaviors.
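
To make the mechanics concrete, here is a minimal sketch of how an engagement-driven ranker might combine tracked signals into a single score. The signal names, weights, and example posts are illustrative assumptions, not any platform's actual model.

```python
from dataclasses import dataclass

@dataclass
class InteractionSignals:
    """Per-post signals a platform might log for one user (illustrative)."""
    watch_seconds: float   # watch time, tracked to the second
    rewatches: int         # times the user replayed the post
    hover_seconds: float   # time spent hovering over the post or its comments
    liked: bool
    shared: bool

def engagement_score(s: InteractionSignals) -> float:
    """Combine tracked behaviors into one ranking score.

    The weights are assumptions chosen only to show the shape of such a
    model: strong reactions (shares, rewatches) count for more than
    passive viewing.
    """
    return (
        0.5 * s.watch_seconds
        + 4.0 * s.rewatches
        + 1.0 * s.hover_seconds
        + 3.0 * s.liked
        + 8.0 * s.shared
    )

# A feed is then just candidate posts sorted by predicted engagement,
# so whatever provoked the strongest reaction reaches more people.
candidates = {
    "calm_tutorial": InteractionSignals(12.0, 0, 1.0, False, False),
    "outrage_clip":  InteractionSignals(30.0, 2, 6.0, True, True),
}
feed = sorted(candidates, key=lambda p: engagement_score(candidates[p]), reverse=True)
print(feed)  # ['outrage_clip', 'calm_tutorial']
```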

Algorithms Amplify Engaging Content, Creating a Feedback Loop to Retain Users

Content with emotional resonance is likely to be amplified by the algorithm, potentially going viral. Features like autoplay extend watch time, often without direct user choice, facilitating continuous consumption of engaging content. Jay Shetty notes that this results in a cycle of negativity, as doomscrolling and the high contagiousness of anger lead to the amplification of divisive content.

Algorithms Exploit Psychology and Biases to Increase Engagement

Social media algorithms are engineered to exploit innate psychological tendencies and biases, which often increases engagement but can also lead to unintended consequences.

Algorithms Exploit Negativity Bias, Boosting Engagement With Negative Content

Shetty discusses how algorithms exploit the negativity bias by boosting engagement with negative content. Yale researchers found that posting moral outrage online leads to social rewards, prompting individuals to post more outrage content. Each additional negative word in a post is linked with a 5 to 8% increase in shares and retweets.
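
As a rough worked example of that figure: if the 5-8% lift is applied per additional outrage word (treating the lift as compounding is an assumption made here purely for illustration), expected sharing grows quickly.

```python
# Worked example of the cited 5-8% lift per additional negative word.
# Compounding the lift per word is an assumption for illustration only;
# the study reports a per-word association, not a causal multiplier.
baseline_shares = 1_000

for words in range(4):
    low = baseline_shares * 1.05 ** words
    high = baseline_shares * 1.08 ** words
    print(f"{words} negative words: {low:,.0f}-{high:,.0f} expected shares")
# e.g. 3 negative words: roughly 1,158-1,260 expected shares
```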

Algorithms Exploit Users' Tendency to Reward Moral Outrage, Leading To More Outrage Content

Outrage online is often rewarded with likes and retweets, which encourages people to post even more outraged content. This phenomenon signals loyalty to one's social group and equates to a feeling of belonging.

Algorithms Push Polarizing Content, Steering Users Into Echo Chambers

Steering users toward extremist and divisive content can create echo chambers. Mozilla's research indicated that volunteers were directed to extremist content even from neutral ...


Additional Materials

Clarifications

  • Negativity bias is a psychological phenomenon where negative events or information have a greater impact on a person's thoughts and feelings than positive ones of equal intensity. This bias evolved as a survival mechanism, making humans more alert to threats and dangers. In social media, it causes users to pay more attention to negative content, increasing its spread and engagement. As a result, algorithms prioritize such content to keep users hooked.
  • Moral outrage on social media refers to strong emotional reactions to perceived ethical violations or injustices shared online. It often motivates users to express anger or condemnation publicly, seeking social validation. This behavior can amplify divisive content and deepen social polarization. Platforms reward such posts with more visibility, reinforcing the cycle.
  • Algorithms track "hover duration" by measuring how long a user's cursor or finger stays over a specific piece of content without clicking. This indicates interest or curiosity, even if the user does not interact further. Longer hover times signal to the algorithm that the content is engaging or relevant. Consequently, the algorithm may prioritize similar content to keep the user engaged.
  • Doomscrolling is the habit of continuously consuming negative news or content online, often leading to increased stress. It triggers the brain’s stress response by flooding it with alarming information, raising cortisol levels. This can cause feelings of helplessness, anxiety, and depression over time. The behavior is reinforced by algorithms that prioritize emotionally charged content, making it hard to stop.
  • Echo chambers are online spaces where users are exposed mainly to opinions that match their own, limiting exposure to differing views. They form because algorithms prioritize content similar to what users have previously engaged with, reinforcing existing beliefs. This selective exposure reduces critical thinking and increases polarization. Over time, users become isolated within these homogeneous information bubbles.
  • Autoplay automatically plays the next video or content without the user needing to click anything. This feature keeps users watching longer by removing pauses between videos. It exploits users' passive attention, making it easy to consume more content than intended. Autoplay increases overall engagement metrics, benefiting the platform's algorithm.
  • Cortisol is a hormone released by the body in response to stress. Elevated cortisol levels can increase feelings of anxiety and tension. Social media use, especially doomscrolling, can trigger stress responses, raising cortisol. This physiological reaction contributes to heightened anxiety and emotional distress.
  • Social rewards like likes and retweets activate the brain's reward system, releasing dopamine, which creates feelings of pleasure. This positive reinforcement encourages users to repeat behaviors that earn social approval. Over time, seeking these rewards can shape online behavior and content sharing patterns. It also strengthens social bonds by signaling acceptance and belonging within a community.
  • Polarizing content is designed to create strong emotional reactions by highlighting extreme or divisive viewpoints. Unlike regular content, it often frames issues in a way that splits opinions sharply, encouraging conflict rather than consensus. This type of content increases engagement by provoking debate and reinforcing existing beliefs ...

Counterarguments

  • Social media algorithms are designed to reflect user preferences, and the content they amplify is often a reflection of what users demonstrate they want to see through their engagement.
  • Not all engaging content is negative or harmful; algorithms also promote positive and educational content that can have beneficial impacts on users.
  • Users have agency and control over their social media use, including the ability to customize their feeds, follow or unfollow certain accounts, and use platform tools to manage the content they see.
  • The relationship between social media use and mental health is complex and not solely determined by algorithms; individual differences and offline factors also play significant roles.
  • Some research suggests that social media can have positive effects on well-being by providing social support, enabling self-expression, and facilitating community building.
  • The concept of echo chambers is debated, with some studies suggesting that social media exposure can actually increase exposure to diverse viewpoints, rather than isolating users in ideological bubbles.
  • The impact of algorithms on behavior is not deterministic; users can be critical of the content they consume and resist the influence of algorithms through conscious consumption practices.
  • Social media companies are ...


Role of Psychology and Choice in Reinforcing Algorithms

Psychology and individual choices play a crucial role in the cycle of reinforcement executed by social media algorithms, highlighting how user behavior can fuel the spread of sensational and biased content.

User Engagement Fuels Algorithms' Negative Effects

Algorithms are engineered to maximize user retention by presenting content that keeps individuals on the platform for extended periods. Design features like autoplay reduce active decision-making, encouraging longer watch time without conscious user choice.

Users Click and Share Sensational or False Stories, Which Algorithms Amplify

Jay Shetty remarks that false news stories are 70% more likely to be retweeted than true ones and reach 1,500 people six times faster, illustrating that user engagement with sensational or false content is a significant factor in algorithmic amplification. This cyclic relationship between users' interactions and algorithms contributes to the widespread dissemination of such content.

Bias-Confirming Content Reinforced by Algorithms

Algorithms contribute to reinforcing users' biases by presenting them with content that similar users are engaging with, under the assumption that it will maintain their attention. Studies show that both liberals and conservatives are more likely to click on links that confirm their biases—21% and 30%, respectively—than on content that challenges their views.
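
A toy simulation can show how even a modest click-rate gap compounds into an echo chamber once the feed learns from clicks. The update rule, starting mix, and function below are assumptions for illustration; only the 21% and 30% click rates come from the studies cited.

```python
import random

random.seed(0)

def simulate(confirming_rate: float, challenging_rate: float, rounds: int = 500) -> float:
    """Return the feed's share of bias-confirming items after the
    algorithm repeatedly serves more of whatever gets clicked."""
    confirming_weight = challenging_weight = 1.0
    for _ in range(rounds):
        total = confirming_weight + challenging_weight
        if random.random() < confirming_weight / total:
            # A bias-confirming item is shown; a click teaches the algorithm.
            if random.random() < confirming_rate:
                confirming_weight += 1
        else:
            # A challenging item is shown; it gets clicked less often.
            if random.random() < challenging_rate:
                challenging_weight += 1
    return confirming_weight / (confirming_weight + challenging_weight)

# 30% vs. 21% click rates, per the cited studies.
print(f"Bias-confirming share of the feed: {simulate(0.30, 0.21):.0%}")
```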

Human Psychology Drives Echo Chambers and Spreads Extreme Content

Even in the absence of recommendation algorithms, user psychology exhibits tendencies that facilitate the formation of echo chambers and spread extreme content.

Researchers Created a Social Media Platform Without Recommendation Algorithms; Users Still Engaged With Like-Minded Individuals

In a study by the University of Amsterdam, social media platform ...


Additional Materials

Clarifications

  • Reinforcement in social media algorithms refers to the process where the system learns from user interactions to repeatedly show similar content that keeps users engaged. This creates a feedback loop, strengthening preferences and behaviors by rewarding certain types of content with more visibility. Over time, this can intensify exposure to specific ideas or emotions, often amplifying sensational or biased material. The goal is to maximize user time on the platform by reinforcing what captures their attention.
  • Algorithms maximize user retention by analyzing user behavior to predict and show content that is most likely to keep them engaged. Autoplay is a feature that automatically plays the next video or content without requiring the user to click, reducing the need for active decisions. This seamless transition encourages longer viewing sessions by minimizing interruptions. Together, these mechanisms exploit psychological triggers like curiosity and reward to sustain attention.
  • Jay Shetty is a well-known motivational speaker and author who often discusses social media and psychology. The statistics he cites come from a 2018 MIT study that analyzed the spread of true and false news on Twitter and found that false news spreads faster and reaches more people than true news, owing to its novelty and emotional impact. Knowing Shetty's background and the study's source helps in assessing the credibility of these statistics.
  • "Bias-confirming content" refers to information that aligns with a person's existing beliefs or opinions. People prefer it because it provides psychological comfort and reinforces their worldview, reducing cognitive dissonance. This preference helps maintain a consistent sense of identity and reduces mental effort needed to process conflicting information. As a result, individuals are more likely to engage with and trust content that supports their biases.
  • AI-powered chatbots in the University of Amsterdam study simulated human users with distinct opinions to test social interactions without algorithmic influence. They autonomously engaged with other bots sharing similar views, mimicking natural human behavior. This setup helped isolate the effect of human psychology from algorithmic recommendations in forming echo chambers. The study demonstrated that even without algorithms, like-minded interactions naturally occur.
  • Echo chambers are environments where people are exposed only to information and opinions that reinforce their existing beliefs. They form because individuals tend to seek out and trust sources that confirm their views, avoiding contradictory perspectives. Social media algorithms and human psychology ...

Counterarguments

  • The extent to which algorithms influence user behavior is complex, and some argue that blaming algorithms for user choices oversimplifies the issue. Users have agency and can choose to engage with diverse content.
  • The impact of sensational or false stories may be overstated, as many users are capable of critical thinking and can discern between credible and non-credible sources.
  • The claim that false news spreads faster and is more likely to be retweeted may not account for the full context, such as the role of bots or coordinated campaigns in spreading misinformation.
  • Algorithms are not the sole factor in reinforcing biases; they reflect and amplify existing societal biases that would exist even in their absence.
  • The study involving AI chatbots may not accurately represent human behavior, as bots lack the complexity of human psychology and social interactions.
  • The effectiveness of recommendation algorithms in reducing partisan engagement might be underestimated, as there could be other factors at play that weren't accounted for in the study. ...


Solutions to Address Negative Effects of Social Media

In response to the widespread issues linked to social media, including political polarization and the spread of misinformation, Jay Shetty proposes various solutions aimed at combating these negative effects. These solutions involve changes to how social media platforms operate and how users interact with them.

Social Media Should Default To Chronological Feeds and Offer Transparent Algorithm Controls

Shetty suggests that social media platforms should default to chronological feeds, which have been shown to cut political polarization and misinformation exposure. While this change might result in decreased user engagement, it could have a positive impact by providing users with a more unfiltered view of the content from those they follow.

Chronological Feeds Cut Political Polarization and Misinformation, Though Engagement Drops

Chronological feeds simply show posts in the order they are published, rather than as curated by an engagement-driven algorithm. Seeing posts in sequence makes users less likely to be served content that aggravates political divisions or propagates false information.
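
A minimal sketch of the two orderings, using invented posts and engagement scores, makes the difference concrete.

```python
from datetime import datetime, timedelta

# Posts, timestamps, and engagement scores are invented for illustration.
now = datetime(2025, 12, 5, 12, 0)
posts = [
    {"id": "friend_update", "posted": now - timedelta(minutes=5), "engagement": 0.2},
    {"id": "divisive_take", "posted": now - timedelta(hours=9),   "engagement": 0.9},
    {"id": "news_article",  "posted": now - timedelta(hours=1),   "engagement": 0.5},
]

# Chronological: newest first, no curation. You simply see what the
# accounts you follow posted, in order.
chronological = sorted(posts, key=lambda p: p["posted"], reverse=True)

# Engagement-ranked: the algorithm resurfaces whatever it predicts will
# hold attention, even if it is older or more divisive.
ranked = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["id"] for p in chronological])  # ['friend_update', 'news_article', 'divisive_take']
print([p["id"] for p in ranked])         # ['divisive_take', 'news_article', 'friend_update']
```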

Add Friction To Social Media For Mindful Engagement

Social media platforms can encourage more mindful engagement by adding friction points that disrupt automatic consumption of content. For example, disabling autoplay features could lead to decreased session lengths, ultimately resulting in less exposure to misinformation and more intentional use of the platform.

Reading or Watching Content First, and Sharing Limits, Could Reduce Misinformation Spread

One approach is to introduce prompts that encourage users to read or view content before sharing it. For instance, Twitter's experiment asking users to read articles before retweeting led to a 40% increase in reading before sharing. Additionally, WhatsApp's implementation of forwarding limits significantly reduced the spread of misinformation in India by adding a simple friction point to sharing.
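
Here is a minimal sketch of how these two friction points might look in code. The function, prompt text, and forwarding limit of 5 are assumptions for illustration, not Twitter's or WhatsApp's actual implementations (WhatsApp's limits have varied over time).

```python
# Two friction points before a share goes through: a read-before-share
# prompt and a forwarding cap. All names and values here are illustrative.
FORWARD_LIMIT = 5

def share(post: dict, user_opened_link: bool) -> bool:
    """Apply friction checks; return True only if the share is allowed."""
    # Friction point 1: nudge users who try to share an unread article.
    if post.get("has_link") and not user_opened_link:
        print("Want to read the article first? Headlines don't tell the whole story.")
        return False  # user must open the link (or confirm) before sharing

    # Friction point 2: cap how many times a message can be forwarded,
    # slowing viral chains of unverified content.
    if post.get("forward_count", 0) >= FORWARD_LIMIT:
        print("Forwarding limit reached for this message.")
        return False

    post["forward_count"] = post.get("forward_count", 0) + 1
    return True

post = {"has_link": True, "forward_count": 4}
print(share(post, user_opened_link=False))  # blocked: prompted to read first
print(share(post, user_opened_link=True))   # allowed: count becomes 5
print(share(post, user_opened_link=True))   # blocked: forwarding limit reached
```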

Social Media Must Offer Algorithm Transparency and Independent Audits

Social media companies should be transparent about their algorithms and how they prioritize content. This information should be available to external researchers so they can evaluate the impact of algorithms on user behavior and information dissemination. As an example, the European Union's Digital Services Act requires ...


Additional Materials

Clarifications

  • Chronological feeds show posts in the exact order they are published, without filtering or prioritizing content. Algorithm-driven curation uses computer programs to select and rank posts based on factors like user behavior, engagement, and preferences. These algorithms aim to keep users engaged by showing content they are more likely to interact with, which can create echo chambers or reinforce biases. As a result, users may see less diverse viewpoints and more sensational or polarizing content.
  • Chronological feeds show posts in the exact order they are published, preventing algorithms from prioritizing sensational or divisive content. This reduces echo chambers by exposing users to a broader range of viewpoints rather than reinforcing existing biases. Algorithms often amplify emotionally charged or misleading posts to increase engagement, which chronological feeds avoid. As a result, users encounter less manipulated content, lowering political polarization and misinformation.
  • Adding friction means introducing small obstacles or extra steps that slow down users' automatic or impulsive actions on social media. Examples include requiring users to confirm before sharing a post, limiting the number of times a message can be forwarded, or adding delays before posting comments. These measures encourage users to think more carefully about their interactions, reducing impulsive spreading of misinformation. Friction helps promote mindful and deliberate engagement rather than rapid, unconsidered activity.
  • Twitter's experiment aimed to reduce the spread of misinformation by encouraging users to read articles before sharing them. This nudge helped users engage more thoughtfully with content, leading to more informed sharing decisions. The 40% increase in article reading before retweeting showed that small design changes can influence user behavior. It demonstrated a practical way to promote responsible information sharing on social media.
  • WhatsApp's forwarding limits restrict how many times a message can be forwarded to different chats. This reduces the rapid spread of viral messages, which often include misinformation. By limiting forwarding, it becomes harder for false information to reach large audiences quickly. This measure encourages users to think before sharing unverified content.
  • Algorithms on social media are sets of rules that decide which posts you see based on your behavior and preferences. They prioritize content to maximize engagement, often showing sensational or emotionally charged posts. Transparency is important because it allows users and researchers to understand and evaluate how these algorithms influence opinions and information spread. Without transparency, harmful effects like bias and misinformation can go unchecked.
  • Independent audits of algorithms involve external experts examining how social media platforms' algorithms select and prioritize content. These audits assess whether algorithms promote harmful content, bias, or misinformation. The findings help regulators and the public hold platforms accountable for their impact on society. This process encourages platforms to improve transparency and reduce negative effects.
  • The European Union's Digital Services Act (DSA) is a law designed to create safer online spaces by regulating how digital platforms manage content and user data. It requires large social media companies to be transparent about their algorithms and content moderation ...

Counterarguments

  • Chronological feeds may not completely solve the issue of political polarization or misinformation, as users can still choose to follow polarizing or unreliable sources.
  • Users might find chronological feeds less engaging or relevant, which could lead to a decline in platform use and a potential decrease in the diversity of content they are exposed to.
  • Adding friction points to social media could be seen as paternalistic or annoying to users who prefer a seamless experience, potentially driving them to other platforms.
  • The effectiveness of prompts to read or watch content before sharing may diminish over time as users become accustomed to them and potentially ignore or bypass them.
  • Sharing limits could be circumvented by determined users and may not address the root causes of why misinformation is created and spread.
  • Algorithm transparency might not be sufficient if the public lacks the expertise to understand the algorithms or if the information provided is too complex or voluminous to be practical for external analysis.
  • Independent audits of algorithms could be limited by the proprietary nature of the algorithms, and companies may resist sharing sensitive information that could compromise their competitive advantage.
  • Curating social media feeds to follow diverse voices assumes users have the desire and ability to engage with perspective ...
