How Racism is Perpetuated within Social Media and Artificial Intelligence

At social media platforms like LinkedIn, Instagram and TikTok, anti-racist policy enforcement has turned into a mechanism to uphold white supremacy. Along the same lines, for some artificial intelligence applications, white-centric data inputs skew user experience and limit our imagining of a more inclusive world.

By Kelly Campbell

Opinions expressed by Entrepreneur contributors are their own.

Social technology and advertising giants celebrate heritage and history observances and make other public commitments to anti-racism, but I've seen first-hand just how performative all of this is at its core. The executive leaders of platforms like LinkedIn, Meta (Instagram) and ByteDance (TikTok) fail to uphold the anti-racist policies touted on their websites and in PR statements. Even artificial intelligence applications reflect the supremacist values of our society, especially in the United States.

Social media platforms prioritize whiteness

There's a new movement that implores white women to do more to confront racism. Saira Rao, who is South Asian, and Regina Jackson, who is Black, are co-authors of a book on this topic, co-founders of Race2Dinner and have also co-produced the new documentary, "Deconstructing Karen." Days after we connected on LinkedIn, Saira's entire profile disappeared, as though she had never existed on the platform. Why would LinkedIn ban a New York Times bestselling author?

LinkedIn's policy prohibits naming a group of people in posts (especially "white people" and "white women"), or using terms like racism or racist, among others. Saira posted about her book, "White Women," but LinkedIn's algorithm flagged it as a breach of policy, treating her use of the phrase as a form of bullying and harassment.

This happens daily to creators of color, and it's why you've likely seen many posts that use abbreviations like "yt women" or special characters to break up words like "rac.ism." Ironically, the policy put in place to protect against hateful language is the very mechanism that gets Black, Brown, Indigenous and LGBTQIA+ creators regularly banned when they attempt to surface the racism, xenophobia, homophobia, transphobia and misogyny they experience.

Social media platforms seem to be institutions of the supremacy mindset, penalizing people of color who are vocal about racism and xenophobia. Speaking out about racism in the workplace typically equates to some level of retaliation, including being ghosted, demoted, left out of meetings and off email threads, or even terminated. The powers that be at LinkedIn, Meta, Twitter and TikTok do the same thing, in that they ban, shadowban or outright delete the accounts of Black and Brown creators.

Unlearning and dismantling racism requires that we talk about it openly in both public and private spaces. If social media corporations continue to penalize anyone who holds white men and women accountable for their racist commentary, how can we move toward belonging, equity and inclusion as a society?

Related: How Can You Start Shifting Your Business to Be Actively Anti-Racist?

Human issues with artificial intelligence

Social channels are not the only place where algorithmic technology both breeds and perpetuates racism. It happens on the results pages of every major search engine and within technology applications, both online and off.

My partner and I were at The Dali Museum in St. Petersburg, Florida, a few weeks ago. As part of a special exhibit called "The Shape of Dreams," advertising agency Goodby Silverstein & Partners (GS&P) created "Dream Tapestry," an interactive art installation powered by DALL-E, an artificial intelligence (AI) program that generates images in response to visitors' descriptions of their dreams, called "prompts," drawing on a dataset of text-image pairs from the internet. DALL-E is a deep learning model developed by OpenAI, a start-up backed by Microsoft.

Since the installation accommodated only six people at a time, my partner and I, both white and queer, entered with two Black couples. Standing at individual kiosks, all six of us entered our dream descriptors. The AI digested our inputs and rendered images on the screens before us, pulling from Surrealist and Symbolist imagery.

Image credit: Kelly Campbell

Then, the AI stitched together all six of our dream renderings on a giant board. We viewed the combination of our dreams as one cohesive work of art and downloaded our own rendering, as well as the tapestry of all six that the AI had generated. My partner and I left feeling that it was undoubtedly worth the length of time we stood in line.

On the flight back home, we reviewed her rendering, then mine. We were amazed by how similar they were despite the phrases we entered being so different. We then looked at the tapestry and noted that all four of the other renderings contained groups of colonials. None of the people in our group's dreams were Black, meaning that the AI assumed that all subjects were white and/or its database contained no text-image pairs of Black people or from Black artists. Neither of us could know for sure, but we were willing to bet that our group wasn't collectively dreaming of white men.

AI, like any other algorithm-based tech output, is only as accurate as its data input. Even with a high degree of granularity, the outputs default to categorizing "white as the norm."
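The point that outputs mirror inputs can be made concrete with a toy sketch. This is not DALL-E's actual architecture, and the captions and counts are invented for illustration; it simply shows how a generator that samples in proportion to its training data reproduces whatever skew that data contains.

```python
import random
from collections import Counter

# Hypothetical training set: one group is heavily over-represented,
# mirroring the "white as the norm" default described above.
training_captions = (
    ["portrait of a white man"] * 90      # over-represented
    + ["portrait of a Black woman"] * 10  # under-represented
)

def generate(n, data, seed=0):
    """Sample n 'generated' captions purely in proportion to the data."""
    rng = random.Random(seed)
    return [rng.choice(data) for _ in range(n)]

# The 90/10 skew in the inputs reappears in the outputs: roughly nine
# out of every ten generated captions depict the over-represented group.
outputs = Counter(generate(1000, training_captions))
print(outputs)
```

A frequency-weighted sampler like this never "decides" to exclude anyone; the exclusion is inherited silently from whoever assembled the dataset, which is exactly why representation in the data pipeline matters.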

Related: Why Are Some Bots Racist? Look at the Humans Who Taught Them.

When most corporate leaders and application developers are cisgender, heterosexual, white men, the lens through which databases are created and filtered is, therefore, also cis/het/white. Therein lies the problem.

White leaders must expand their conscious awareness of the power they wield and the opportunity they have to right the wrongs of their past and present, starting with equal representation, listening to the lived experiences of people of the global majority (PGM), and getting comfortable with uncomfortable conversations.

You might ask, "Were there any Black Surrealist or Symbolist artists, or images that depicted Black or Brown people during that era?" The answer is a resounding yes. Looking at the dream gallery online, an entire history seems to have been excluded from the database: imagery from African and African-Caribbean artists of the same era, categorized as Afro-Surrealism, as well as the Négritude movement of 1930s France.

While the capability of DALL-E seems magical, I imagine we can do better than excluding Black and Brown artists and subjects. Through this exclusion, the AI shapes the narrative that we are collectively dreaming of a world solely comprised of white bodies.

Related: 5 Reasons Leaders Fail to Transform DEI Rhetoric into Action

When Jeff Goodby, the cis/white co-founder and co-chairman of GS&P, talks about The Dream Tapestry, he refers to the power of the AI to reflect back to us what we're dreaming as a nation, or even globally, at this precise time in history. If the goal of DALL-E is to create a collective image of what we dream as a whole, it would seem that there's an opportunity to depict the world that many of us want to live in and dream about living in: one that is diverse, equitable, inclusive and provides a sense of belonging for all genders and races.

Before LinkedIn bans another PGM and this installation makes its way into another museum, could we take an empathetic step back to understand how a lack of BIPOC representation reinforces a supremacy mindset and keeps us from truly seeing each other's humanity?

Kelly Campbell

Entrepreneur Leadership Network Contributor

Trauma-Informed Conscious Leadership Coach

Kelly Campbell is a Trauma-Informed Conscious Leadership Coach to self-aware visionaries. A keynote speaker and host of THRIVE: Your Agency Resource, Kelly is also founder of Consciousness Leaders, the world's most diverse representation agency. Their book, Heal to Lead, will be released April 2024.
