Why LGBT creators are suing YouTube, and why it matters for the future of digital rights

Gargi Sharma
5 min read · Sep 1, 2019


A group of LGBT creators has filed a lawsuit against YouTube and its parent company, Google.

They argue that:

1. Their content is being restricted simply because they are queer or discuss queer themes, costing them potential subscribers and ad money; and

2. Homophobes on YouTube are allowed to spew hate, their content is promoted on videos by queer creators, and they are allowed to make money off of their hatred.

Two women wear rainbow flags on Congress Street in Portland, Maine during the annual Pride parade
Photo by Mercedes Mehling on Unsplash

YouTube’s content policies were the subject of controversy earlier this year when Vox’s Carlos Maza tweeted about being targeted by Steven Crowder, a homophobic YouTuber who used racist and homophobic slurs against Maza in his videos. Crowder’s subscribers then left hateful comments on Maza’s videos and doxxed him.

Maza directed his tweets not at Crowder, but at the platform, which had failed to enforce its own policies against harassment.

YouTube took a few days to respond and people weren’t happy about it.

Following the public disapproval of its policies and the attention received by Maza’s tweets, YouTube eventually demonetised Crowder’s channel.

Demonetisation meant he would no longer receive ad revenue for his videos, but his content could stay on the website, and he could keep uploading new ones.

Can we still call social media sites public squares?

We must remember that YouTube and other social media platforms started out by marketing themselves as public squares.

They claimed that their platforms were for people to come together and share content.

These people included queer creators talking about issues no one else would. Communities were formed on websites like Tumblr that allowed people to explore their identities online, without fear of reprisal in their offline world.

Gradually, these safe(r) spaces started excluding the voices they were built on. Tumblr’s “adult content” ban, which targets queer content more than non-queer content (and women’s content more than men’s), made it harder for people to build communities.

Twitter’s disparate application of its content policy allowed hateful comments and threats, but removed tweets where people talked about their lived experiences under heteropatriarchy.

And then there is YouTube.

Recommending Nazi-apologist, homophobic, sexist videos.

Researchers published an extensive study (posted on arXiv, which is hosted by Cornell University) on how the recommendation algorithm is basically a “radicalization pipeline”.

Photo by Szabo Viktor on Unsplash

It wouldn’t be wrong to say that YouTube has broken its promise.

It promised to be a safe space for everyone, but recent instances show that it promotes a false equivalence between the speech of hateful creators and the targets of their abuse.

The creators behind the five channels suing YouTube and Google say that YouTube used their content for financial gain at the expense and detriment of its queer creators.

In an interview with Chips with Everything, Maza tells The Guardian that hate mongers are allowed to profit from their hate speech on YouTube through ad revenue, and that even if they are demonetised, they can keep their old videos up and upload new ones to share their vile views on the internet.

As his tweets make clear, his primary issue is not with individual creators, but with the platform’s inconsistent application of its policies and the way it discriminates against queer creators because of their queerness.

So what does the lawsuit mean for digital rights?

YouTube hosts roughly 95% of the internet’s video content, so when it restricts some videos based on opaque standards, it is essentially promoting some content over others.

The kind of content it recommends/allows shapes the future of the platform.

Because of its 95% market share, it is in a position to regulate and control most video speech on the internet.

It is under an obligation to respect the rights and interests of its creators.

The need for this respect is even more acute for people with marginalised identities. It is important to understand that without positive interventions, digital worlds can and do replicate existing inequalities in society.

YouTube says its “policies have no notion of sexual orientation or gender identity.” I say why not?

When a company presents itself as a platform of freedom, it is under an obligation to ensure safeguards are built into its algorithms to protect marginalised identities.

Lastly, there is no such thing as content neutrality. If a homophobic majority keeps reporting queer content as “unacceptable” or “shocking,” that is what the algorithm will use in its future determinations.
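This feedback loop can be made concrete with a toy sketch. Everything here is hypothetical (the tags, threshold, and functions are invented for illustration, not YouTube's actual system), but it shows how a moderation rule that is "neutral" on its face — restrict whatever gets reported most — simply encodes the bias of whoever reports most:

```python
# Hypothetical sketch of report-driven moderation, not any real platform's system.
# A "neutral" rule (restrict whatever is reported often) inherits the bias
# of whichever group files the most reports.

from collections import Counter


def train_from_reports(reported_tags):
    """Tally user reports per content tag."""
    counts = Counter()
    for tag in reported_tags:
        counts[tag] += 1
    return counts


def is_restricted(counts, tag, threshold=3):
    """Facially neutral rule: restrict any tag reported at least `threshold` times."""
    return counts[tag] >= threshold


# A hostile majority mass-reports videos tagged "lgbt"; actual slurs go unreported.
reports = ["lgbt", "lgbt", "lgbt", "lgbt", "gaming"]
counts = train_from_reports(reports)

print(is_restricted(counts, "lgbt"))   # queer content gets auto-restricted
print(is_restricted(counts, "slurs"))  # hate speech, unreported, stays up
```

The rule never mentions sexual orientation, yet its outcome is discriminatory — which is exactly why "our policies have no notion of identity" is not the same as a non-discriminatory system.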

Photo by Sharon McCutcheon on Unsplash

What’s the solution?

YouTube must respect human rights.

I propose building on the concept of privacy by design and thinking in terms of rights by design.

Principles of human dignity and non-discrimination must be fed into and be a key component of any technological development.

Respect for human rights should be the default, not an afterthought.

Technology must focus on the user at the end of the process. It must be convenient, friendly, and respectful for its most vulnerable users.

Our responses to these issues determine the digital future we will have. Let us make sure that it is inclusive and promotes conversation, instead of polarisation and hate in the name of engagement.

We must learn from this and make sure that our future includes everyone, not just the loudest or those with the most followers.

Keep in mind that when some content is privileged over others and promoted by an algorithm, those are the ideas that gain public attention and approval. Think of the positivity we could have if the digital space were actually equal and affirming.

Have a look at the plaintiffs’ video here:

You can read their complaint here.

Next discussion: de-platforming

Written by Gargi Sharma

climate justice + data justice (she/her)