{"id":144770,"date":"2025-07-01T12:42:18","date_gmt":"2025-07-01T16:42:18","guid":{"rendered":"https:\/\/www.shortform.com\/blog\/?p=144770"},"modified":"2026-04-26T14:52:35","modified_gmt":"2026-04-26T18:52:35","slug":"ruha-benjamin-race-after-technology","status":"publish","type":"post","link":"https:\/\/www.shortform.com\/blog\/ruha-benjamin-race-after-technology\/","title":{"rendered":"Ruha Benjamin&#8217;s Race After Technology: Book Overview"},"content":{"rendered":"\n<p>Is racism embedded in the digital technologies that increasingly shape our daily lives? Do algorithms and AI systems perpetuate centuries-old patterns of discrimination?<\/p>\n\n\n\n<p>Ruha Benjamin&#8217;s <em>Race After Technology: Abolitionist Tools for the New Jim Code<\/em> addresses these urgent questions. Benjamin&#8217;s work aims to show how seemingly neutral digital systems\u2014from hiring software to healthcare algorithms\u2014actually amplify racial inequalities in new and often invisible ways.<\/p>\n\n\n\n<p>Keep reading for an overview of this thought-provoking book.<\/p>\n\n\n\n<!--more-->\n\n\n\n<h2 class=\"wp-block-heading\" id=\"h-overview-of-ruha-benjamin-s-race-after-technology\">Overview of Ruha Benjamin&#8217;s <em>Race After Technology<\/em><\/h2>\n\n\n\n<p>Ruha Benjamin&#8217;s <a href=\"https:\/\/aas.princeton.edu\/publications\/research\/race-after-technology-abolitionist-tools-new-jim-code\" target=\"_blank\" rel=\"noreferrer noopener\"><em>Race After Technology: Abolitionist Tools for the New Jim Code<\/em><\/a> argues that race is a technology\u2014a tool we use to organize society into hierarchies that benefit the people who have the most power. This technology has evolved over centuries, and one way it operates today is through digital systems, from search engines to surveillance systems to predictive algorithms. 
As these systems increasingly shape our access to employment, healthcare, housing, and justice, they amplify historical patterns of discrimination\u2014while hiding behind claims of objectivity and fairness. Benjamin refers to this phenomenon as \u201c<a href=\"https:\/\/www.shortform.com\/blog\/hub\/science\/what-is-the-new-jim-code\/\">the New Jim Code<\/a>\u201d and characterizes it as the latest evolution in America\u2019s long history of racial control and discrimination.<\/p>\n\n\n\n<p>Benjamin is a professor of African American Studies at Princeton University and founding director of the <a href=\"https:\/\/www.thejustdatalab.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">Ida B. Wells Just Data Lab<\/a>. When she published <em>Race After Technology<\/em> in 2019, facial recognition systems were already misidentifying Black faces at alarming rates, predictive policing algorithms were disproportionately targeting Black neighborhoods, and hiring algorithms were reproducing historical patterns of employment discrimination. Benjamin argues that these weren\u2019t isolated failures, but systemic effects of the New Jim Code.<\/p>\n\n\n\n<p><a href=\"https:\/\/www.shortform.com\/app\/book\/race-after-technology\/preview\" rel=\"nofollow\">Shortform&#8217;s guide to Ruha Benjamin&#8217;s <em>Race After Technology<\/em><\/a> unpacks her argument that racial categories function as technologies of control, explores how the New Jim Code embeds racism into technology, and examines her recommendations for race-conscious design and abolitionist tools.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>What Is Race? What Is Racism?<\/strong><\/h3>\n\n\n\n<p><strong>Benjamin argues that race is a powerful social technology used to separate people into groups, stratify those groups, and explain away injustice. 
<\/strong>Racism functions as the operating system for this technology: the set of beliefs, practices, and structures that make race \u201cwork\u201d as a tool of control. Throughout American history, this operating system has been deliberately engineered to maintain hierarchies of power and to justify inequitable distributions of resources.<\/p>\n\n\n\n<p>Benjamin explains that, <strong>like other technologies, race (and the system of racism that makes it function) has gone through multiple iterations<\/strong>, each designed to maintain racial control when previous versions faced challenges or resistance. When one form of racial control becomes socially or politically untenable, racism doesn\u2019t disappear\u2014it adapts. Each new iteration is less visible and more difficult to challenge than the last. To illustrate this, Benjamin traces the evolution of racism through three major phases; let\u2019s explore each.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Slavery: Denying Humanity<\/strong><\/h4>\n\n\n\n<p>Benjamin explains that <strong>racism emerged as a way to reconcile the glaring contradiction between America\u2019s proclaimed ideals of liberty and equality and the brutal reality of enslavement.<\/strong> By denying the full humanity of Black people, America could maintain both its democratic rhetoric and its <a href=\"https:\/\/www.shortform.com\/blog\/racial-hierarchy-white-fragility\/\">racial hierarchy<\/a>. 
This earliest, most explicit form of racism required little disguise: It operated through open claims of racial inferiority and dehumanization.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Jim Crow: Explicit Denial of Rights<\/strong><\/h4>\n\n\n\n<p>When slavery ended in 1865, <strong><a href=\"https:\/\/www.shortform.com\/blog\/jim-crow-segregation-nasa\/\">Jim Crow laws<\/a> evolved to serve the same purpose: maintaining <a href=\"https:\/\/www.shortform.com\/blog\/white-male-supremacy\/\">white supremacy<\/a><\/strong> by explicitly denying Black Americans access to voting, education, housing, and economic opportunities. Benjamin explains that these laws ensured Black Americans remained a subordinate class, despite formal freedom. Though less extreme than slavery, <a href=\"https:\/\/www.shortform.com\/blog\/jim-crow-racism\/\">Jim Crow racism<\/a> remained explicit: The laws openly specified different treatment based on race.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>The New Jim Crow: Implicit Denial of Rights<\/strong><\/h4>\n\n\n\n<p>As civil rights legislation dismantled explicit segregation in the 1960s, a new racist technology emerged. Known as the <a href=\"https:\/\/www.shortform.com\/app\/book\/the-new-jim-crow\" rel=\"nofollow\"><em>New Jim Crow<\/em><\/a>, from the 2010 book of the same name by Michelle Alexander, this system of discrimination operated through <strong>ostensibly \u201crace-neutral\u201d policies that nevertheless perpetuated racial inequalities\u2014primarily the War on Drugs.\u00a0<\/strong><\/p>\n\n\n\n<p>The War on Drugs was a federal effort to combat drug use and distribution via stricter law enforcement. 
Policies associated with this effort didn\u2019t mention race but led to the mass incarceration of Black Americans, largely because laws mandated harsher penalties for crack cocaine, which was more common in Black communities, than for powder cocaine, which was more common in white communities. <strong>Such policies exacerbated racial inequality while allowing America to claim it had moved beyond racism after the <a href=\"https:\/\/www.shortform.com\/blog\/black-power-and-civil-rights-movement\/\">Civil Rights Movement<\/a><\/strong>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>What Is \u2018The New Jim Code\u2019?<\/strong><\/h3>\n\n\n\n<p>After explaining the history of racism in the US, Benjamin contends that we\u2019ve entered a new phase in the evolution of racism. She calls this \u201cthe New Jim Code,\u201d a term that echoes Alexander\u2019s \u201cNew Jim Crow\u201d to emphasize the continuity of racial control mechanisms throughout American history. <strong>This latest iteration involves racism embedded in digital technologies and algorithms<\/strong>\u2014for example, facial recognition software that struggles to accurately identify darker-skinned faces and risk assessment algorithms that disproportionately flag Black individuals as \u201chigh risk\u201d for committing crimes.<\/p>\n\n\n\n<p><strong>Digital technology increasingly mediates access to opportunities and resources, so when these systems embed racial biases, they exacerbate inequalities<\/strong>. 
Consider healthcare algorithms that determine patient care: When these systems use past medical spending as a proxy for medical need, they <a href=\"https:\/\/www.aclu.org\/news\/privacy-technology\/algorithms-in-health-care-may-worsen-medical-racism\" target=\"_blank\" rel=\"noreferrer noopener\">recommend less care for Black patients<\/a> than for white patients with the same symptoms\u2014not because Black patients are healthier, but because historical <a href=\"https:\/\/www.shortform.com\/blog\/racial-disparities-in-healthcare\/\">racism in healthcare<\/a> meant they had less access to expensive treatments in the past. Similarly, mortgage-lending algorithms trained on historical loan data can perpetuate decades of redlining by <a href=\"https:\/\/news.lehigh.edu\/ai-exhibits-racial-bias-in-mortgage-underwriting-decisions\" target=\"_blank\" rel=\"noreferrer noopener\">denying loans to qualified applicants<\/a> in predominantly Black neighborhoods.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>The Invisibility of the New Jim Code<\/strong><\/h4>\n\n\n\n<p>Benjamin argues that <strong>the New Jim Code is particularly insidious because we think of technology as neutral, objective, and fair<\/strong>. She explains that the algorithms that drive modern technology operate through numbers, statistics, and code rather than through explicit racial categories, so their outputs seem data-driven rather than opinion-based. This veneer of scientific authority shields discriminatory outcomes from scrutiny: We\u2019re more likely to question a hiring manager\u2019s judgment than an algorithm\u2019s determination that certain candidates are \u201cnot a good fit.\u201d The technology\u2019s complexity also creates plausible deniability: Developers can claim they never programmed the algorithm to discriminate, even when that\u2019s precisely what it does.<\/p>\n\n\n\n<p>For these reasons, many people reject the idea that technology can perpetuate racism. 
However, Benjamin argues that <strong>a system doesn\u2019t have to be built by someone with explicit racial animus or malicious intent to produce racist outcomes<\/strong>. She understands racism as a systemic force rather than a personal attitude and argues we should judge systems by their effects rather than their intentions.<\/p>\n\n\n\n<p>Benjamin contends that <strong>the combination of perceived objectivity and technical opacity makes racial discrimination under the New Jim Code harder to identify and challenge<\/strong> than many past manifestations of racism. The New Jim Code operates across virtually every domain of modern life\u2014from healthcare, education, and employment to housing, criminal justice, and social services\u2014making it potentially the most pervasive and difficult-to-challenge iteration of racism yet.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>How Does the New Jim Code Operate?<\/strong><\/h3>\n\n\n\n<p>Now that you know what the New Jim Code is, let\u2019s discuss how it works. Benjamin explains that <strong>the New Jim Code operates through four key dimensions, where <a href=\"https:\/\/www.shortform.com\/blog\/race-classification\/\">racial hierarchies<\/a> become encoded in seemingly neutral technological systems<\/strong>. These dimensions don\u2019t operate in isolation but interact with and reinforce each other, creating multiple layers through which racism becomes embedded into the technologies that shape our lives.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Asymmetric Visibility: The Paradox of Being Watched But Not Seen<\/strong><\/h4>\n\n\n\n<p><strong>Asymmetric visibility\u2014what Benjamin calls \u201ccoded exposure\u201d\u2014describes how technologies selectively focus attention on certain aspects of marginalized groups while rendering other aspects invisible<\/strong>. 
Benjamin explains that algorithms tend to amplify stereotypical views of marginalized communities\u2014often as threats or problems to be managed\u2014while simultaneously failing to recognize their individuality, humanity, and specific needs.<\/p>\n\n\n\n<p>An example of this is how content moderation algorithms on social <a href=\"https:\/\/www.shortform.com\/blog\/internet-platform\/\">media platforms<\/a> <a href=\"https:\/\/www.vox.com\/recode\/2019\/8\/15\/20806384\/social-media-hate-speech-bias-black-african-american-facebook-twitter\" target=\"_blank\" rel=\"noreferrer noopener\">disproportionately flag posts by Black users<\/a>, especially those written in African American Vernacular English, as \u201coffensive\u201d or \u201chateful.\u201d Even when the same message is posted by a Black user and a white user, the Black user\u2019s post is more likely to be removed. These algorithms, trained on data labeled by humans who bring their own biases to the task, end up amplifying racial discrimination under the guise of neutral content policies. The result is a digital environment where Black expression is hypervisible to surveillance systems as potentially problematic content, yet invisible in terms of its cultural context and value.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Biased Architecture: When Prejudice Is Built Into the System<\/strong><\/h4>\n\n\n\n<p><strong>Biased architecture\u2014in Benjamin\u2019s term, \u201cengineered inequity\u201d\u2014is created when technology reinforces social biases by learning from flawed data<\/strong>. If an algorithm is trained on historical data that reflects societal prejudices\u2014like the hiring records from companies that rarely hired minorities\u2014it will reproduce those patterns of discrimination. 
This becomes especially problematic when these algorithms make <a href=\"https:\/\/www.shortform.com\/blog\/important-decisions-in-life\/\">important decisions<\/a> about people\u2019s lives, determining who gets job interviews, loan approvals, or shorter prison sentences. The discrimination gets built right into systems that seem objective but actually perpetuate inequality.<\/p>\n\n\n\n<p>An example of this is how r\u00e9sum\u00e9-screening algorithms trained on historical hiring data often <a href=\"https:\/\/www.npr.org\/2024\/04\/11\/1243713272\/resume-bias-study-white-names-black-names\" target=\"_blank\" rel=\"noreferrer noopener\">disadvantage Black applicants<\/a>. Applicants with names suggesting they\u2019re white receive more callbacks than those with identical r\u00e9sum\u00e9s but Black-sounding names. When these human biases become encoded in automated hiring systems that screen r\u00e9sum\u00e9s based on patterns from past hiring data, the discrimination becomes systematized and hidden behind seemingly objective technical processes. Companies using such algorithms may believe they\u2019re making \u201cdata-driven\u201d hiring decisions when, in reality, they\u2019re perpetuating historical patterns of exclusion rather than evaluating candidates based on their actual qualifications.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Invisible Exclusion: When \u201cNeutral\u201d Design Leaves People Out<\/strong><\/h4>\n\n\n\n<p><strong>Invisible exclusion\u2014or \u201cdefault discrimination,\u201d in Benjamin\u2019s terms\u2014occurs when technology is designed with only the dominant group in mind<\/strong>. When developers (often white men) create and test products primarily for and on people like themselves, they overlook how the technology might work differently for others. 
Even without intending harm, these \u201cuniversal\u201d designs inevitably function better for some people than others.<\/p>\n\n\n\n<p>Benjamin notes this happens in many forms of technology we use every day: For example, some public restrooms have automatic soap dispensers that don\u2019t recognize darker skin tones, and search engines return images of white people when you search for \u201cprofessional hairstyles.\u201d Sometimes the consequences are more serious: Medical devices calibrated for lighter skin, such as pulse oximeters, give less accurate readings on darker skin and can miss dangerous oxygen-level drops in Black patients.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Disguised Control: When Surveillance Masquerades as Help<\/strong><\/h4>\n\n\n\n<p><strong>Disguised control\u2014or, as Benjamin puts it, \u201ctechnological benevolence\u201d\u2014describes how technologies that monitor, control, and restrict are often presented as helpful, progressive innovations<\/strong>. This framing makes it difficult to criticize potentially harmful systems because they come wrapped in the language of assistance and improvement.<\/p>\n\n\n\n<p>Electronic benefits transfer (EBT) cards exemplify disguised control. Used to distribute welfare benefits like food assistance, these cards are promoted as efficient and modern alternatives to paper vouchers or checks. However, they also function as a tool of surveillance and control\u2014governments can <a href=\"https:\/\/epicforamerica.org\/social-programs\/here-is-what-food-stamp-recipients-buy\/\" target=\"_blank\" rel=\"noreferrer noopener\">track recipients\u2019 purchases<\/a>, <a href=\"https:\/\/www.wjhl.com\/news\/local\/tennessee-bill-could-ban-candy-and-soda-from-snap-benefits\/\" target=\"_blank\" rel=\"noreferrer noopener\">restrict what they can buy<\/a>, and suspend benefits with little transparency or recourse. 
People who pay for food with credit cards or cash aren\u2019t subject to the same scrutiny.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>How Can We Challenge the New Jim Code?<\/strong><\/h3>\n\n\n\n<p>Benjamin doesn\u2019t just diagnose the problem of racism in technology. She also offers a path forward to a future where we design technologies that challenge rather than reinforce racial hierarchies. She recommends two ideological shifts\u2014taking a race-conscious approach to technology and shifting from <em>reform<\/em> to <em>abolition<\/em>\u2014as well as some practical strategies for changing how technology is developed. Let\u2019s explore each of her recommendations.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Take a Race-Conscious Approach to Technology<\/strong><\/h4>\n\n\n\n<p><strong>Benjamin argues against \u201ccolorblind\u201d approaches to technology that dominate the tech industry today<\/strong>. These approaches typically either treat race as irrelevant to technology development or focus narrowly on hiring a few people of color without changing how technologies are designed. Many companies believe that simply adding one or two Black or Brown faces to their teams \u201csolves\u201d technological racism, but they continue to build products that ignore how race shapes users\u2019 experiences.&nbsp;<\/p>\n\n\n\n<p>Meanwhile, developers often operate under the false assumption that they can build neutral tools in a biased world\u2014that if they simply ignore race in their design process, their technologies will work equally well for everyone. The result is technologies that inevitably reproduce and sometimes amplify existing inequalities.<\/p>\n\n\n\n<p><strong>Instead, Benjamin advocates for race-conscious design<\/strong>: deliberately considering how racism operates and actively working to create technologies that challenge rather than reinforce racial hierarchies. 
Race-conscious design involves:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Acknowledging how racism shapes data, institutions, and <a href=\"https:\/\/www.shortform.com\/blog\/social-context\/\">social contexts<\/a><\/li>\n\n\n\n<li>Examining how technologies might impact different racial groups before deployment<\/li>\n\n\n\n<li>Including diverse perspectives throughout the design process, not just as an afterthought<\/li>\n\n\n\n<li>Building tools specifically aimed at exposing and challenging racial inequities<\/li>\n\n\n\n<li>Implementing structural changes to power dynamics, including regulatory frameworks, community oversight mechanisms, and legal remedies that allow people to seek redress for algorithmic harm<\/li>\n<\/ul>\n\n\n\n<p>Benjamin highlights the <a href=\"https:\/\/www.ajl.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">Algorithmic Justice League<\/a> (AJL) as an example of an organization that embodies race-conscious design principles. Founded by Joy Buolamwini after her research uncovered significant racial and <a href=\"https:\/\/www.shortform.com\/blog\/implicit-gender-bias-gender-roles-in-society\/\">gender bias<\/a> in facial recognition systems, the AJL works to highlight and address algorithmic bias through research, advocacy, and art. The organization created the Safe Face Pledge, a commitment companies could take to prohibit the use of facial recognition in weaponry or lethal systems and to increase transparency about how facial recognition is used, particularly in policing.<\/p>\n\n\n\n<p>(Shortform note: The Safe Face Pledge initiative <a href=\"https:\/\/medium.com\/@Joy.Buolamwini\/announcing-the-sunset-of-the-safe-face-pledge-36e6ea9e0dc5\" target=\"_blank\" rel=\"noreferrer noopener\">was later sunset<\/a> when it became clear that self-regulation was insufficient, as major tech companies refused to sign on despite being given a clear path to mitigate harms. 
This outcome demonstrates Benjamin\u2019s point that voluntary corporate commitments alone are inadequate\u2014addressing technological racism requires changes to the fundamental power structures of the tech industry, not just technical fixes or individual goodwill.)<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Shift From a Focus on Reform to Demands for Abolition<\/strong><\/h4>\n\n\n\n<p>Benjamin distinguishes between reformist approaches that try to make biased technologies slightly less discriminatory and abolitionist approaches that question whether certain technologies should exist at all in their current forms. She argues that <strong>some technologies, like predictive policing and digital surveillance, are so fundamentally embedded in systems of racial control that they cannot be reformed.<\/strong> Instead, they must be abolished and replaced with alternatives that center justice and community wellbeing.<\/p>\n\n\n\n<p>Benjamin <a href=\"https:\/\/aas.princeton.edu\/news\/shiny-high-tech-wolf-sheeps-clothing\" target=\"_blank\" rel=\"noreferrer noopener\">highlights the Appolition app<\/a> as an example of abolitionist technology that addresses injustice in the bail system. The bail system keeps people in pretrial detention simply because they can\u2019t afford to pay for their freedom. Because marginalized people are less likely to be wealthy, this system disproportionately harms communities of color. While a reformist approach might focus on creating a \u201cfairer\u201d risk assessment algorithm to determine bail amounts\u2014one that appears race-neutral but still relies on factors correlated with race like arrest history or zip code\u2014Appolition takes a fundamentally different path.<\/p>\n\n\n\n<p>The app helps people collectively pool small donations (rounded-up spare change from everyday purchases) that are sent to community bail funds. 
These organizations use the money to free people from pretrial detention while simultaneously working toward ending the cash bail system entirely. Each time someone bailed out through these funds returns to court without a financial <a href=\"https:\/\/www.shortform.com\/blog\/what-is-incentive-meaning-and-definition-economics\/\">incentive<\/a>, it provides evidence that the entire premise of cash bail is unnecessary. Through this dual approach\u2014providing immediate relief while building evidence against the system\u2019s necessity\u2014Appolition demonstrates how abolitionist technologies can address both symptoms and root causes simultaneously.<\/p>\n\n\n\n<h4 class=\"wp-block-heading\"><strong>Make Practical Changes to the Existing Tech Industry<\/strong><\/h4>\n\n\n\n<p>In addition to ideological shifts like a race-conscious approach and an abolitionist framework, Benjamin suggests practical changes we could implement within the existing tech industry to mitigate the racist <a href=\"https:\/\/www.shortform.com\/blog\/negative-impact-of-technology-on-society\/\">impacts of technology<\/a>.<\/p>\n\n\n\n<h5 class=\"wp-block-heading\">Diversify the Tech Workforce<\/h5>\n\n\n\n<p>First, Benjamin argues that who builds technology matters. <strong>The homogeneity of the tech workforce\u2014which consists mainly of white and Asian men\u2014contributes to blind spots<\/strong> in design and development. As we\u2019ve discussed, many companies try to address this via superficial diversity initiatives, where they hire a few people from underrepresented groups. However, Benjamin argues these initiatives are inadequate because they place the burden of \u201cfixing\u201d bias on minority employees without changing underlying power dynamics. 
For example, a Black engineer might flag a potentially discriminatory feature, but if they don\u2019t have the authority to change the project\u2019s course, their insight will remain unheeded.&nbsp;<\/p>\n\n\n\n<p>In contrast, Benjamin says <strong>meaningful diversity entails creating conditions where everyone can shape the technologies being built.<\/strong> This requires:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Increasing representation at all levels, especially in leadership and <a href=\"https:\/\/www.shortform.com\/blog\/methods-of-decision-making-crucial-conversations\/\">decision-making<\/a> positions<\/li>\n\n\n\n<li>Creating inclusive environments where diverse perspectives are valued and can influence product decisions<\/li>\n\n\n\n<li>Addressing structural barriers to entry and advancement<\/li>\n\n\n\n<li>Compensating people for expertise drawn from lived experience, not just technical credentials<\/li>\n<\/ul>\n\n\n\n<h5 class=\"wp-block-heading\">Audit Technologies for Bias<\/h5>\n\n\n\n<p>Second, <strong>Benjamin advocates for rigorous testing and auditing of technologies for discriminatory impacts before deployment<\/strong>. Effective auditing includes:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Testing with diverse populations and in diverse contexts<\/li>\n\n\n\n<li>Examining training data for historical biases<\/li>\n\n\n\n<li>Analyzing outcomes across different demographic groups<\/li>\n\n\n\n<li>Continuous monitoring for unexpected discriminatory effects<\/li>\n<\/ul>\n\n\n\n<p>She emphasizes that <strong>these audits must have teeth: They must lead to substantive changes when bias is found.<\/strong> For example, when facial recognition technology shows higher error rates for darker-skinned faces, a superficial adjustment might be to simply add a disclaimer about potential inaccuracy. 
A substantive change would be redesigning the system with more diverse training data or even delaying deployment until acceptable accuracy across all demographics is achieved. The difference is whether the burden of the flaw falls on those who built the technology, who must fix it, or on the people it misidentifies.<\/p>\n\n\n\n<h5 class=\"wp-block-heading\">Involve Communities in Design<\/h5>\n\n\n\n<p>Third, Benjamin explains that technologies shouldn\u2019t be developed in isolation from the communities they affect. <strong>She advocates for participatory design\u2014an approach that directly involves the people who will be impacted by a technology in its creation process<\/strong>. This looks like:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Consulting community members at the earliest stages to determine whether a technology is even needed<\/li>\n\n\n\n<li>Compensating community members for their expertise<\/li>\n\n\n\n<li>Giving communities real decision-making power throughout development<\/li>\n\n\n\n<li>Evaluating success based on community-defined metrics<\/li>\n<\/ul>\n\n\n\n<p>For example, consider predictive policing software that directs police to certain neighborhoods based on historical crime data, without those neighborhoods\u2019 input. A participatory alternative to predictive policing would begin by asking residents about their safety priorities. This could lead to entirely different technologies\u2014like community-run emergency networks that connect people with mental health professionals, or digital platforms for coordinating neighborhood watch programs, mutual aid networks, and restorative justice initiatives. 
These alternatives would address real community needs (safety, support, and connection) rather than impose surveillance- and punishment-focused solutions.<\/p>\n\n\n\n<h5 class=\"wp-block-heading\">Establish Regulatory Frameworks<\/h5>\n\n\n\n<p>Finally, while Benjamin focuses primarily on how technology is designed and developed, <strong>she also notes the importance of external oversight<\/strong>. Effective regulatory frameworks might include:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>Legal remedies for those harmed by <a href=\"https:\/\/www.shortform.com\/blog\/algorithmic-discrimination\/\">algorithmic discrimination<\/a><\/li>\n\n\n\n<li>Transparency requirements that make <a href=\"https:\/\/www.shortform.com\/blog\/automated-decision-making\/\">automated decision-making<\/a> processes understandable to those affected<\/li>\n\n\n\n<li>Limits on the use of certain technologies in high-risk contexts, such as criminal justice or housing<\/li>\n<\/ul>\n\n\n\n<p>These external guardrails create accountability when internal processes fail to prevent racist impacts. For example, people denied housing based on algorithmic assessments should have the right to understand the factors that influenced that decision and to challenge potentially discriminatory outcomes. This contrasts with the current black-box nature of many algorithmic systems, where those affected have no way to understand or contest decisions that impact their lives.<\/p>\n","protected":false}}