Nicole Cardoza

Understand the TikTok strike.


Good morning and happy Wednesday! Don't overlook the Black creator TikTok strike – it may read as petty social media drama on the surface, but this organized response is a larger demand for respect and representation in the growing creator economy. Learn more in today's newsletter! And follow us on TikTok if you haven't already.

Thank you for your support! This daily, free, independent newsletter is made possible by readers like you. Consider making a donation to support our work. You can start a monthly subscription on Patreon or our website, or give one-time using our website, PayPal, or Venmo (@nicoleacardoza).

– Nicole


TAKE ACTION


  • Learn more about the Black TikToker Strike by following the hashtag #BlackTikTokStrike.

  • Support marginalized creators on social media: use more engagement tools on posts you see from creators you enjoy. Like, comment, share, and save the images and videos that they post.

  • Understand how strikes work and the best way to support them.

  • Consider: What do you know about the origins of your favorite digital trends – e.g., your favorite gif, TikTok dance, or meme?


GET EDUCATED


By Nicole Cardoza (she/her)

Over the past month, many Black social media creators organized a strike to stop creating and posting dance choreography on the social media app TikTok. The social media app is built around reposting and remixing content from other creators, and a popular feature is learning and recording dances to trending songs. When Black female rapper Megan Thee Stallion released her new song, “Thot Shit,” on June 11, many Black creators agreed not to create choreography. Ironically, the music video for the song in question centers women of color as essential workers and highlights the type of hostility that Black creators experience online.

Get a 1-min breakdown of the issue on the ARD TikTok >

This is because of a growing conversation around compensation and equity for Black people on TikTok. Black creators are often behind the TikTok trends that go viral but rarely gain recognition; white TikTok users are oftentimes miscredited as creators and gain sponsorships and media recognition (Teen Vogue). Black creators have also been vocal about the algorithmic censorship of content related to Black Lives Matter last summer, which further increased racial disparities in who’s celebrated on the platform (Time).

But this isn’t a TikTok-specific issue. Much of popular culture today leans heavily on language, dance, and other cultural cues taken directly from the Black community – particularly the Black LGBTQ+ community. From dances to hairstyles, phrases, and music, dominant culture often adopts Black culture and makes it mainstream. And white people, who benefit from more power and privilege in our society, are more likely to gain recognition for echoing these cultural acts – even if they had no hand in creating them. Learn more in a previous newsletter.

Moreover, the Black community still has to fight for their cultural markers to be accepted in mainstream culture even as those with power and privilege enjoy them. Consider recent initiatives to allow natural hairstyles in schools (Chalkbeat), or the fight to normalize AAVE as a valid vernacular (Black Youth Project). With this context, it’s clear how a strike on short dance choreography reflects a broader stance on the cultural appropriation of Black culture throughout history.


It’s also important to recognize the role of withholding labor in the history of Black movements. Black people have gone on strike, withholding labor to extract fair compensation, since before the Civil War. Consider the Great Railroad Strike of 1877, where over 100,000 railroad workers halted trains and stopped working for over two months in pursuit of better wages and conditions. There’s also the Memphis Sanitation Strike of 1968, where 1,300 Black workers walked off the job, demanding that the city recognize their union, increase wages, and end inhumane conditions. As garbage stacked up across the city streets, the workers never relented, attracting the support of Dr. Martin Luther King, Jr., who visited and delivered his famous “Mountaintop” speech. Learn more in a previous newsletter. And just last year, when players from major league sports stopped playing for 48 hours after the shooting of Jacob Blake, the world took note – and fundamentally shifted how sports leagues respond to social issues (Vox). Their efforts – alongside other labor strikes led by other people of color – didn’t just raise awareness of critical issues, but carved a path for more equitable practices in labor unions altogether (Teen Vogue).

You can argue that TikTok influencers aren’t exactly the same type of wage workers who took part in past strikes. But let’s not overlook the influence of the “creator economy” and those who lead it. As digital communities flourish, nearly 50 million people around the world consider themselves creators and receive some type of compensation for their work (Forbes). Creators offer tremendous value by building content and community that might be inaccessible otherwise, particularly creators from marginalized communities who offer an alternative to what’s mainstream. But being a creator is a difficult job with little infrastructure or safety (Teen Vogue). It’s powerful to see creators withholding their labor without that type of support behind them and advocating for more equitable practices in this burgeoning labor market.

Perhaps this strike will encourage everyone who enjoys content online to reflect and consider: how do we value the creators of the content we consume? What labor might we take for granted – both online and off? And how can the strikes of the past transform our future?


Key Takeaways


  • Black creators on TikTok are on strike to take a stance against cultural appropriation and lack of credit for the choreography they introduce to the platform

  • Strikes throughout history have been a powerful way to shape perceptions about labor and value

  • Popular culture is rooted in Black cultural markers, but rarely celebrates or protects those who create it


PLEDGE YOUR SUPPORT


Thank you for all your financial contributions! If you haven't already, consider making a monthly donation to this work. These funds will help me operationalize this work for greatest impact.

Subscribe on Patreon Give one-time on PayPal | Venmo @nicoleacardoza

Nia Norris

Uncover racial bias in photography.


Good morning and happy Tuesday! When the YouTube iOS app was first released, about 10% of users were somehow uploading their videos upside-down. Engineers were puzzled until they took a closer look – they had inadvertently designed the app for right-handed users only. Phones are rotated 180 degrees in left-handed users' hands, and because the team was predominantly right-handed, this flaw missed internal testing (Google).

This unconscious bias is prevalent in much of the technology we use right now. Today, Nia outlines the role that bias has played in the history of photography technology.

Thank you for keeping this independent platform going. In honor of our anniversary, become a monthly subscriber on our website or Patreon this week and we'll send you some swag! You can also give one-time on Venmo (@nicoleacardoza), PayPal or our website.

– Nicole


TAKE ACTION


  • Read about the exclusionary history of photography, the lack of diversity at tech companies, and racial bias in their products today.

  • If you are a STEM employer, ensure that you are hiring people of color for the development of new technology.

  • Buy technology from companies that are actively working to develop more inclusive hardware and software.


GET EDUCATED


By Nia Norris (she/her)

The word inclusivity may not immediately come to mind when we think about camera design. After all, cameras do the job they have been doing for years: they capture the image in front of them so that we can keep a piece of the moment. However, if you have noticed that it is often harder to take photos of more melanated individuals, you might be onto something. Google and Snapchat both recently announced that they are redesigning their cameras to be more inclusive of individuals who have darker skin (The Verge, Muse). But what does this mean?

Cameras have been historically calibrated for lighter skin. When color film was developed, the first model to pose for camera calibration in photo labs was a woman named Shirley Page. After that, all color calibration cards were nicknamed “Shirley cards.” For decades, the “Shirley cards” featured only white women and were labeled “normal.” It wasn’t until the 1970s that Kodak started testing cards with Black women (NPR). They released Kodak GoldMax, a film advertised as being able to photograph “a dark horse in low light” – a thinly veiled promise of being able to capture subjects of color in a flattering way (NYTimes).

Although digital photography has led to some advancements, like dual skin-tone color balancing, it can still be a challenge to photograph individuals with a darker skin tone in artificial light. There are special tricks that cinematographers and photographers use for shooting darker skin despite these technological limitations, such as using a reflective moisturizer (NYTimes). Snapchat’s camera filters have been criticized as “whitewashed,” with Black individuals pointing out that the Snapchat camera makes their faces look lighter (The Cut). Snapchat has also released culturally insensitive camera filters including a Juneteenth filter encouraging users to “break the chains” and a Bob Marley filter that amounted to digital blackface (Axios).

After taking heat for digital whitewashing, Snapchat has enlisted the help of Hollywood directors of photography to create what it calls an “inclusive camera.” The effort, led by software engineer Bertrand Saint-Preaux, aims to ease the dysphoria that Black users may feel after taking selfies through the app. It includes adjusting camera flash and illumination variations to produce a more realistic portrait of users of color (Muse). Similarly, Google is changing the auto-white balancing and algorithms for the Pixel camera. They’re also creating a more accurate depth map for curly and wavy hair types (The Verge). Apple started this process a few years ago when it developed the iPhone X in 2017 (Engadget).

It’s not just the quality of photography that needs to be changed. We must also consider bias in the way that AI analyzes images. Twitter’s “saliency algorithm” has come under fire for racial bias in its preview crops of photos: it automatically favored white faces in preview crops, no matter which image was posted first. Twitter is currently planning to remove algorithmic cropping from the site entirely in response (BBC).


This is not the first time a company has simply removed an AI’s ability to recognize an image instead of redeveloping the AI to be more inclusive. In 2015, it was pointed out that Google Photos was labeling Black individuals as “gorillas.” Instead of fixing the AI, the company simply removed gorillas from its recognition software. In 2018, Wired followed up by testing photos of animals: although Google Photos could reliably identify many types of animals, there were simply no search results for “gorillas,” “chimps,” “chimpanzees,” or “monkeys” (Wired). Less than 1% of Google’s technical workforce is Black (NBC News).

Since photography is almost exclusively digital at this point, hopefully companies will take more initiative to develop cameras that adequately capture people of color in a flattering way. We also need to adopt inclusive AI practices to ensure everyone is treated equally on social media. When seeking to develop inclusive tech, people of color need a seat at the table to help ensure that both the software and hardware we use are not racially biased.


Key Takeaways


  • Since film photography was developed, cameras have historically favored white individuals.

  • Currently, tech companies are working to develop more inclusive cameras after criticism from people of color.

  • The way we consume photography is also biased by the way algorithms and AI show us photographs through social media.



Nicole Cardoza

Do not circulate graphic videos of police brutality.


Happy Tuesday and welcome back to the newsletter. This was never meant to be a newsletter that lasted a year. I was in a lot of pain when I started this project. It felt like the whole world woke up to the violence that white supremacy has wielded for generations. I pledged to send one email each day because we know that this work isn't solved with one action. This world won't change from reactions, but from a collective, persistent investment today, tomorrow, and the day after, too. Today is a reflection on the ongoing conversations about the role of graphic videos in public discourse.


Thank you to everyone who makes this work possible, particularly those who have been here since the start. We won't stop until we're all free. If you want to support, give $7/month on Patreon. Or you can give one-time on our website or PayPal. You can also support us by joining our curated digital community.

Nicole


TAKE ACTION


  • Join the George Floyd Remembrance Virtual Day of Action by participating in any of the action items listed on the campaign page.

  • Donate to Yes 4 Minneapolis, a Black-led campaign to replace the Minneapolis Police Department with a new Department of Public Safety.

  • Use the resources created by the Witness Media Lab when considering posting or sharing videos of police brutality.

  • If you feel resourced, reflect on what you’ve learned – and unlearned – about the fight for racial equity over the past year. How can you continue to advocate for change? Use this article for guidance on identifying your role in your community.


GET EDUCATED


By Nicole Cardoza (she/her)

One year ago, George Floyd was murdered by a member of the Minneapolis Police Department. A video of his death, recorded by then 17-year-old Darnella Frazier, generated tens of millions of views in the following days and has been shared repeatedly across this past year, reigniting the everlasting fight for racial equity (NowThis). It’s undeniable that the video, and the conversation that sparked from it, fundamentally shifted society. It’s also re-ignited discussions on the role of violent videos available for public consumption, particularly regarding their impact on communities of color.


Note: this is different from advocating for the release of bodycam videos to the public. Bodycam footage, designed to hold police officers accountable while on the job, should never be withheld from a victim’s family and community.

But violence against Black people has also been used as a commodity, bartered and sold throughout time. I can’t help but think about how, just decades ago, lynchings were treated as a public attraction. Crowds would gather to partake in festivities surrounding the unjust killing, posing for photographs and taking home pieces of the person’s corpse as “souvenirs.” Postcards would be created and distributed as lasting memories. Learn more in a previous newsletter. Videos taken by police bodycams and shared widely have a similar feeling; digital souvenirs of violence protected by social and political norms.

But user-generated videos, like the one recorded by Frazier, have a different intent. Although still difficult to watch, they’re the recordings of what an everyday person was forced to bear witness to, individuals rendered helpless in the face of violence. Recording a conflict can be a form of bystander intervention when other options are limited. And social movements across time have been sparked by marginalized communities leveraging whatever channel they can to ensure their voices are heard. In this case, user-generated videos are journalism, a testament to the stories that define generations.

Author and professor Allissa Richardson, who advocates for citizen journalism and encourages everyone to consider their role in documenting the world around them, refers to this as sousveillance. It’s the opposite of surveillance – the recordings created by body cameras, security cameras, and other public, often state-sanctioned devices. Sousveillance is people capturing stories with their own devices (usually smartphones), stories that will likely counter or disprove the facts presented by those with more power and privilege (Nieman Lab).


It’s no surprise that journalism leaders are calling for Darnella Frazier to receive the Pulitzer Prize for Public Service for the video she recorded that changed the world, undoubtedly exemplifying content that “roots out corruption and contributes to the public good” (Nieman Lab).


Regardless of their intention, though, these assets need to be shared with sensitivity, as they exacerbate the trauma that people of color experience regularly. A study found that 20% of Black people who watch such a video are “significantly affected” by it, experiencing lasting effects including stress, anxiety, post-traumatic stress disorder, or vicarious PTSD (Yahoo). These only compound the race-based trauma that people of color experience in their daily lives (PBS). In an article written by Arionne Nettles, Alfiee Breland-Noble, the founder and director of the mental health organization AAKOMA Project, notes how Black adolescents deal with vicarious trauma from watching the videos (ZORA).

Instead, cellphone videos of vigilante violence and fatal police encounters should be viewed like lynching photographs — with solemn reserve and careful circulation.


Allissa Richardson, assistant professor at the University of Southern California’s Annenberg School for Communication and Journalism, 2021 fellow at the Berkman Klein Center for Internet and Society at Harvard University, and author of Bearing Witness While Black: African Americans, Smartphones, and the New Protest Journalism, for Nieman Lab.

Leon Ford, who was shot and paralyzed by a police officer during a traffic stop in 2012, also urges us to consider the individuals and families of the victims. “These people have children. These people have cousins, aunts, uncles, grandparents, who can’t live a normal life...even though I don't watch those videos, I can feel that energy. When I see somebody posting, I scroll past it. It still sticks to me” (Yahoo).

Some will argue that it’s necessary to share these videos because we will never be able to fight for justice without them. But what does it say about us that justice can only be pursued for the most atrocious cases, and only if they were captured on video and circulated broadly enough to create public outcry? Why is justice only pursued when the crime is deemed worthy of national attention? Most urgently, when will we take action not to share, but to change the social conditions to ensure that these instances never happen again?

That will require changing our behavior. We must channel immediate outrage into a persistent commitment to long-term change. Media platforms are taking note; more have chosen not to post the videos in their social media feeds and to create multiple news articles highlighting the event – one including the video footage, one without. As individuals, we can do the same. Instead of sharing to elicit strong emotions like shock or disgust, consider sharing the information sans video. More importantly, we recommend sharing proactive ways your community can address policing and public safety issues, like upcoming city council meetings or alternatives to calling the police. It’s action – not awareness – that will prevent these videos in the future.



Key Takeaways


  • The U.S. has a long history of distributing assets depicting violence on marginalized bodies

  • The circulation of police violence videos often exacerbates stress, anxiety, vicarious trauma, and PTSD in the Black community

  • We need to evolve beyond sharing for shock and awe and take action for solidarity



Nicole Cardoza

Don't do digital blackface.


Happy Wedthursfrimonday? Hard to tell these days.

In an interview regarding his books “Home and Exile” and “Things Fall Apart,” Chinua Achebe stated that “the whole idea of a stereotype is to simplify.” But what happens when stereotypes become imitated, replicated, scaled, and normalized because of social media? Today’s newsletter analyzes our nation’s history with blackface, blackface in the digital age, and who benefits from Black culture. It’s part of our ongoing series on cultural appropriation – catch up in our archives.

Thank you to all who give to support this newsletter. Please consider giving one-time on our website, PayPal, or Venmo (@nicoleacardoza). Or, pledge $5/month on Patreon. Your funds help pay staff and ensure this newsletter remains free.

Nicole



TAKE ACTION


Research to find a specific example of a racist stereotype shared through digital blackface – like a gif, TikTok, or other form of content. Consider the following:

  • What harmful stereotype(s) does this content perpetuate?

  • What was the intention of the person who shared this content?

  • What was the impact of sharing this content?


GET EDUCATED


By Nicole Cardoza

What's the deal with TikTok?

TikTok, a social media app with 100 million users in the U.S., has drawn Trump’s ire. The app, known for highly shareable short videos often based on viral themes, was created by ByteDance, a Chinese media company. Trump argues that the Chinese government could compel the company to share American users’ data or use the platform as a form of propaganda to worsen U.S.-China relations (Wired). Although this seems more a PR tactic than a national security threat, millions were outraged that they might lose their social media platform. Unsurprisingly, Instagram took this time to launch Reels, its copycat TikTok competitor. This news also aligned with new criticisms of the app’s role in perpetuating digital blackface.

To understand digital blackface, we must start with the history of blackface itself. Minstrel shows gained popularity in the 1830s in New York, where white performers with blackened faces (most used burnt cork or shoe polish) would don tattered clothing and imitate enslaved Black people. These performances characterized Black people as lazy, ignorant, superstitious, and hypersexual, and intentionally portrayed them as hard to understand and prone to thievery or cowardice (NMAAHC). From the late 19th century into the early 20th, these shows gained national popularity, moving with ease from stages to radio shows (NYTimes). Popular American actors like Judy Garland, Mickey Rooney, and Shirley Temple brought these caricatures to the big screen. And this imagery extended beyond performances to marketing anything “from tobacco to molasses to breakfast cereal” (NYTimes).

And these weren’t merely comical performances. These shows helped build a national consensus around slavery and discrimination against Black people. The tortured depictions “embodied the assertion that blackness was grotesque in itself because it could never achieve the mythical ideal of whiteness” (NYTimes). Consider that the first popularly known blackface character was named “Jim Crow” and depicted “a clumsy, dimwitted black slave.” The name became a common slur against Black people and was later used to refer to the anti-Black laws implemented after the Reconstruction period (History).

Also, consider that it took until 2020 for Aunt Jemima to change its branding based on these stereotypes (CNN), and Gucci thought that this turtleneck was appropriate as recently as 2019 (NPR). There’s a comprehensive list of public figures who have used blackface (CNN). We are still watching blackface unfold in real-time.

Unpacking Digital Blackface

The term “digital blackface” is a bit different. Coined by Joshua Lumpkin Green in 2016, digital blackface describes how technology enables non-Black people to appropriate Black culture and adopt Black personas (Wired). This trend is particularly relevant on social media, where likes and views reign supreme, so anything goes. Blanketed by the relative comfort of anonymity, anyone can leverage Black language and culture without claims to the experiences or identities that create the community.

We’re watching this unfold on TikTok in real-time. Jalaiah Harmon, a 14-year-old who loves dancing, created an intricately choreographed dance to the song “Lottery” by the Atlanta rapper K-Camp and uploaded it to Instagram. The dance, called the Renegade, quickly got to TikTok, where it went viral. But Charli D’Amelio, the white dancer with the most followers on TikTok at the time, is considered its “CEO” because she, like many others, copied it without crediting its source (NYTimes). Jalaiah is only now seeing her due, while Charli charges an estimated $100,000 per sponsored post (Cosmopolitan), launched her own nail polish line, and has been in a series of high-profile partnerships, like dancing with Jennifer Lopez and appearing on “The Tonight Show” with Jimmy Fallon (Variety).


TikTok is designed for ideas to be shared and remixed, so what happened with the Renegade isn’t surprising – but it is disappointing in a world that often undervalues Black women. This same model also fuels deeper harm against Black people, allowing, for example, white people to create videos lip-synching the words of Black people to exaggerate them, or imitating racial stereotypes – both of which sound more relevant to the 1830s than the 2020s (Wired). As a comprehensive Wired article notes, TikTok users likely aren’t always doing it to be racist, but simply for the virality, clout, and followers. Nevertheless, disparaging posts about slavery, police brutality against Black people, and other terrible stereotypes aren’t just posted, but encouraged, because of the algorithm.

“Virality often occurs through shocking behavior. Whether it's acting provocatively, bullying, or using racial slurs and stereotypes, a lot of users see that their questionable behavior gets a reaction, and that just encourages them.”

Morgan Eckroth, barista and TikTok user, in Wired

Although TikTok’s algorithm fuels this trend, digital blackface isn’t new. Vine, a similar social media platform launched by Twitter that enabled users to create and share six-second videos, had several racist trends and challenges go viral, sparking accusations of blackface as early as 2013 (Metro). In 2016, Snapchat released a Bob Marley filter on 4/20 that literally gave users digital blackface and dreadlocks, which is both racially insensitive and minimizes the life and legacy of the artist (Wired). And AAVE (explained in a previous newsletter) is used so frequently across social media platforms that a TikTok user declared it simply “internet culture” (Daily Dot).


Digital Blackface and Gifs

Digital blackface manifests in other ways online. A common one is how many people use gifs of Black people and Black culture to express themselves, despite not being Black themselves. Certainly, we can all love a scene from a movie that just happens to feature a Black actor or feel that a kid’s facial expression suits how we feel right now, regardless of the kid’s race. But as Lauren Michele Jackson, the author of White Negroes, explains in this brilliant Teen Vogue article, the gifs of Black people that get shared tend to depict overexaggerated expressions of emotion. And our society often associates Black people with being excessive. Consider the trope of the “angry Black woman,” the “angry Black man,” or the “aggressive Black boy.” These caricatures have been perpetuated in the media throughout history and used to justify condemnation, subjugation, and violence. See Serena Williams, Christian Cooper, and Michael Brown for specific examples.

“Digital blackface in GIFs helps reinforce an insidious dehumanization of Black people by adding a visual component to the concept of the single story.”

Naomi Day, speculative fiction and Afrofuturist writer, on Medium

Beyond digital blackface, there are more common ways people use Black culture and imagery for their gain. They may seem innocuous but are just as harmful. Consider how, after the protests, brands started using more photos of Black people in their social media feeds, despite not addressing the internal culture or practices that contribute to Black oppression. Although they’re not directly adopting a Black culture or persona, they are trying to align themselves with a community they haven’t earned the right to represent.


What do we do about it?

This isn’t to say that an individual sharing their favorite gif or jumping into a TikTok trend is inherently racist. It’s the system these actions are couched in. As we’ve explained in other posts on cultural appropriation, Black people experience significant discrimination and harm for expressing their culture – while white people are celebrated and compensated for it. I’m not taking away your favorite gifs for the sake of doing right. This is another opportunity to keep doing the work. As you speak in a cultural language that’s not your own, consider what it means for the people who speak that culture fluently.

And social media platforms have a responsibility to protect their users from harm. The worst part of the TikTok story is how Black creators regularly experience racism, harassment, and censorship on the platform, especially when speaking up against these issues (Wired). Despite its public announcements, the company still hasn’t taken sufficient action to protect and center the needs of the Black community, even while naming Black people “the most inspiring, creative voices on our platform” (Wired). TikTok’s safety and security are in question – but not for the dangerous space it’s created for communities of color on the app. As other companies rush to acquire or compete with TikTok, I hope they make mitigating digital blackface a priority.


Key Takeaways


  • Digital blackface describes how technology enables non-Black people to appropriate Black culture and adopt Black personas

  • Blackface has deep roots in the founding of America, and was used to normalize racist stereotypes against enslaved African people

  • The TikTok algorithm exacerbates digital blackface while exposing its Black community to harm

  • We need tech to take responsibility for digital blackface on their platforms, and hold ourselves accountable for our own actions


