Social media sites shouldn't publish ads promoting sexist propaganda


“No one is going to hand us the agenda we voted for,” Donald Trump said, staring directly at me. “We must fight for it.”

It was the evening before the midterm elections, and I watched the political ad play on my laptop. “They have these witch hunts, they have all nonsense,” Trump continued. “It’s a hoax.” The words “witch hunts,” “nonsense,” and “hoax” all appeared in bold, red letters across the bottom of the screen as he said each word.

I didn’t watch this ad voluntarily. In fact, I was doing homework. My U.S. history teacher had assigned my class the task of watching videos on YouTube about industrialization in the United States. Before I could view my homework assignment, however, YouTube forced me to watch at least five seconds of this Trump ad before I could skip the rest, which I did.

This wasn’t the first time something like this had happened. Weeks earlier, I went on YouTube to watch a review video about early economic systems in the United States while studying for a test. The video was preceded by a cleanly animated ad produced by “Prager University.” The ad claimed taxation is theft. It was immediately followed by another ad, produced by the same “university,” that tried to disprove the wage gap.

I took a break from studying to do my own research. A quick Google search revealed that “Prager University” is not really a university at all, but an organization that produces five-minute-long animated videos meant to “explain” political, economic, and social concepts from an extremist, conservative point of view. Between its website and YouTube page, Prager University has published hundreds of videos denouncing everything from immigration and gun control to feminism, Planned Parenthood, and nontraditional gender identities and gender roles.

I clicked around and found a video in which a middle-aged white man calls feminism a “mean-spirited, small-minded, oppressive philosophy” that demands the suppression of men. He then endorses the cult of domesticity, which, according to him, was brought to an end by industrial advancements that “deprived” women of their traditional places in the home. Another video uses misleadingly framed statistics and faulty reasoning to try to convince the viewer that Planned Parenthood is nothing but evil. The idea that people my age can be so easily and accidentally exposed to such lies is horrifying.

Young people spend a lot of time online. According to the Pew Research Center, 89 percent of teens reported that they went online at least several times a day in 2018. Forty-five percent said they are online nearly constantly. Teens particularly spend a lot of time on social media sites like YouTube; in fact, 85 percent of U.S. teens say they use YouTube. What we see online shapes what we believe and what we consider normal. If we see damaging, anti-progressive messages every day, then these ideas become familiar and even appealing, creating a “familiarity bias.” And, of course, these videos are designed to be easy to watch, understand, and share: They’re bite-sized and visually appealing. If we see our peers or other people in our networks we trust share these messages, we may believe these views are more widespread than they actually are, and this perceived popularity may further normalize these extremist ideas.

Of course, this is a problem not just for young people’s ideas, but for our democracy at large. It’s well established that social media sites helped spread “fake news” — which was often based in extremist, biased views — leading up to the 2016 election. This extremism, in turn, sparked the spread of racist, sexist, and otherwise dangerous ideas, which we’ve seen lead to violence beyond the Internet. For example, it’s hard to believe there’s no connection between the rise in popularity of white-nationalist, conspiracy-theorist YouTube channels and the rise in acts of white supremacy.

Social media companies have taken some action to address this phenomenon, but much of it is reactive rather than proactive. For example, in March of 2018, Prager University filed a lawsuit against Google (YouTube is a subsidiary of Google) after YouTube listed some of the “university’s” videos — including videos that made gross generalizations about Muslims and Middle Eastern countries, defended e-cigarette use and gun ownership, and criticized American colleges’ progressive mindsets — under “restricted mode.” This mode is an optional feature that parents can enable to filter out sensitive content from their children’s screens. The case was dismissed with the ruling that YouTube is a private entity and is thus able to make its own decisions about what is considered appropriate for younger audiences.

Considering that dangerous ideas have already become rampant in our country, we have to be especially conscious of and actively work against the influence of media that spreads fallacies and hate speech. We can’t individually look away when a far-right post or ad pops up, nor can we try to completely shut down platforms like YouTube. But we can ask these sites to act from within. They can and should do much more to alert their users about the reality of this propaganda and should try to prevent its creation by requiring all ads to come from verified accounts. They can also more prominently identify who has paid for the ad viewers are forced to watch.

An even more crucial part of the solution, however, might come from educating our generation about why media literacy is so important. Being aware of today’s propaganda is just as important as learning about U.S. history — particularly since the two now undeniably exist hand-in-hand.




Amelia Burns