How Inequity Is Coded Into The Surveillance Tech Monitoring Black Communities

Oct 16, 2025 - 12:00
Source: Sandwish / Getty

In an age of persistent data collection, it’s easy to reminisce about when the internet felt more open, anonymous, and unregulated. We ventured deep into the cyber wilds of viral memes, early internet forums, chatrooms, Flash games, and social media sites that once had fewer guardrails, less tracking, and as much anonymity as we desired. 

It’s not an accident that the internet of today feels different. With calls for age-verification requirements across the web, today’s internet is living up to its roots as a surveillance network, one that has expanded into civilian life with tendrils reaching into our smartphones, classrooms, connected home devices, and even our streets and neighborhoods. 

However normalized data collection has become, do we actually understand the recent developments in mass digital surveillance and how they impact the safety of Black, Brown, and Muslim communities as we navigate unmasked authoritarianism? That question leads to another: where did digital surveillance originate, and in what ways has it evolved?

ARPANET: The Pentagon’s Tool Against Black Political Movements

Modern-day American privacy is envisioned through the lens of government agencies, corporations, and data brokers monitoring and storing what we say and do online. It’s also understood that people of color have disproportionately been the victims of discriminatory surveillance due to systemic poverty, racial profiling, and the political movements that push back against those injustices. So, before we had the internet as we know it, how was surveillance achieved?

Journalist and reporter Yasha Levine offers us a view into the development of the early internet in his book Surveillance Valley. In it, he details how the 1960s internet, known as ARPANET, was built by the Pentagon for military use. The network was later adapted into a civilian network that grew into the internet we know today. 

Levine’s research found that the driving force for developing this military network was to create “computer systems that could collect and share intelligence, watch the world in real time, and study and analyze people and political movements with the ultimate goal of predicting and preventing social upheaval… In other words, the internet was hardwired to be a surveillance tool from the start.” 

Using this system, the U.S. Army was able to treat activists and protesters, including groups like the Black Panthers, as “enemy combatants embedded within the indigenous population.” They created color-coded maps that marked “Negro neighborhoods,” gathered intelligence on Black Panther weapon arsenals, and even collected data on the private lives of a large number of American citizens. It was a coordinated authoritarian response to civil rights and anti-war movements of the time that threatened to mobilize people en masse against the established order. Sound familiar?

Silicon Valley: Creating The Eyes And Ears Of The State

Today, we see the maturation of ARPANET in the Silicon Valley-driven suppression and censorship of activists and protestors on social media during mass political movements, from Black Lives Matter to current-day pro-Palestinian activism, a crackdown that runs on conflating peaceful protest with violent extremism. The Black Lives Matter political moment also coincided with the coronavirus pandemic, when U.S. tech companies were investigating various kinds of surveillance tech to stop the spread of COVID-19, which helped normalize the idea that our smartphones and other devices could track where we go and who we come into contact with at all times.

We’re now living in a reality in which the U.S. government, with support from tech companies like Palantir, is aiming to use that same technology to consolidate data and create detailed dossiers on both U.S. citizens and immigrants that could lead to a nationwide rise of algorithmic, AI-powered predictive policing like what’s currently in use by the LAPD. 

Public schools around the country are also adopting AI technologies like facial recognition and predictive analytics in the interest of shaping K-12 education, but education data reporter Quintessa Williams points out the risk that these tools are being deployed disproportionately in Black and low-income schools, quietly expanding the school-to-prison pipeline. She reports that “91 percent of public schools use security cameras, while more than 80 percent monitor students’ online activity. Yet there is little evidence that these tools improve safety — and even less to show they’ve been tested for bias.” 

Additionally, as the U.S. Immigration and Customs Enforcement (ICE) pays Palantir millions to update the technical infrastructure underpinning mass deportation efforts, Black immigrant students are living in a climate of fear due to increased ICE activity in schools, compounding the existing challenges of anti-Blackness, bullying, and bias they experience.

Systemic Surveillance And Poverty: The Digital Poorhouse

The evolution of ARPANET is also visible in how Americans are surveilled with technology designed to analyze and make decisions about the lives of the poor and working class. In her book titled Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, political scientist and author Virginia Eubanks refers to this system as a digital poorhouse—a collection of algorithms, databases, and risk models intended to punitively track the behavior and spending habits of those living in poverty to absolve the state of the responsibility to eradicate it. 

This modern system of automated decision-making, data mining, and predictive analytics entraps the poorest and most vulnerable, especially recipients of public assistance and welfare, within our communities. In her words, it “deters the poor from accessing public resources; polices their labor, spending, sexuality, and parenting; tries to predict their future behavior; and punishes and criminalizes those who do not comply with its dictates.” 

Digital Sovereignty

The political movements of the past decade still have a long way to go to adequately address the role of data mining and digital surveillance systems in civilian death, perpetuating violence, exposing migrants to harassment and unlawful ICE detention, deepening class divides, and supporting mass incarceration. How might we have approached digital sovereignty differently if we knew that the internet would eventually lead to such an invasive culture of observation under the guise of child safety, personalization, and convenience? A course-correction might first require us to notice how many devices around us are continuously collecting data about the most intimate details of our lives, and to demand to know who has access to it.

Joude Ellis is a communicator and cultural organizer based in NY whose writing prioritizes radical ideas that challenge mainstream narratives and help platform Black and other marginalized perspectives.

SEE ALSO:

10 Billionaires Actively Harming Black, Marginalized Communities

Black Man Falsely Arrested Based On AI Tech

Facial Recognition Algorithm Falsely Accuses Black Man

Post-9/11 Black Communities Still Under Surveillance

Shady Algorithms Helping Cops Lock Up Black Folks

‘Coded Bias’ Documentary Uncovers Racial Bias In Technology
