
Hollywood Wants Your Money…and Your Mind


Imagine a group of activists so powerful that they could beam their propaganda directly into your brain. Now also imagine that they’re so sophisticated, they actually get you to pay them to do it.

Unfortunately, you don’t have to imagine it. It’s real. It’s Hollywood.

As big as the internet has become, Hollywood—and here, I’m talking specifically about television—is still king. Not only does it reach hundreds of millions of people with its messaging, it embeds that messaging in seemingly innocuous stories—stories that distract us from the hardships of daily life; stories that make us feel good, compassionate, and decent.

We watch TV, in other words, because we like it. And just as Americans didn’t think much about the carcinogens in the cigarettes they smoked for decades, most Americans don’t think much about the toxic politics in the television they watch.

But those who create that content do. They spin out hour after hour of slickly produced left-wing propaganda and give themselves awards for doing it. They applaud each other’s “courage,” even though all their friends think exactly as they do.

I spoke with nearly a hundred members of the Hollywood community when I wrote my book, Primetime Propaganda, and many of them openly admitted they inserted “social justice” messages into their shows.

How they do it is both clever and effective.

Hollywood writers, producers, directors, and actors create characters we keep wanting to spend time with, then have those characters act in ways most of us would judge wrong. Then, in effect, they ask us a question: Isn’t it really okay that Rachel from Friends decided to have a baby without first marrying Ross? After all, you like Ross and you like Rachel! How can what they do be bad?

It hasn’t always been this way. For decades, Hollywood promoted traditional American values. That changed, as did so much else in the late 1960s and ’70s, when Hollywood stopped celebrating American values and started transforming them.

For example, in the early 1970s, abortion was a hotly contested issue. A year before the Supreme Court decided Roe v. Wade, the top-rated TV sitcom Maude featured a storyline in which the title character has an abortion. The LA Times described it as “a watershed moment” in TV history. Why? Well, because it removed the stigma of abortion. Millions of Americans, sitting in their living rooms, saw a beloved character do something they did not approve of, and they felt sympathy for her.

Something similar happened in the early 2000s. Vice President Joe Biden was right when he said that Will & Grace had a major impact on how Americans think about same-sex marriage. Before the hit NBC show, most Americans had a live-and-let-live attitude toward private sexual behavior, but few supported the idea of men marrying men or women marrying women. Seeing the charming and funny Will Truman live his life week after week paved the way for a much wider acceptance of same-sex marriage.

Current shows like Orange Is the New Black and Transparent are trying to effect the same change on the issue of transgenderism.

You may think these are all good things. Or that some are, and some aren’t. That’s not my point. My point is that Hollywood has had a tremendous influence on our culture, and that influence has pushed entirely toward the left side of the political spectrum.