YouTube to Take Closer Look At What Videos Are Allowed On Their Site

YouTube hasn’t had the best few months, and it looks like the company is finally starting to take action to make things right. The site is locked in an ongoing battle over offensive content: what constitutes “offensive,” what sort of filters should apply, and how should the site handle advertising on those videos?

Google’s chief business officer, Philipp Schindler, touched on the latter recently when he penned a letter to advertisers addressing concerns that their ads may show up on videos containing offensive content.

“We have a responsibility to protect this vibrant, creative world – from emerging creators to established publishers – even when we don’t always agree with the views being expressed,” the letter read. “But we also have a responsibility to our advertisers who help these publishers and creators thrive.” The letter acknowledged that YouTube has policies in place to define where Google ads appear, but “at times [they] don’t get it right.”

“Recently, we had a number of cases where brands’ ads appeared on content that was not aligned with their values,” Schindler said. “For this, we deeply apologize.”

Several dozen major companies have threatened to stop advertising on the site after The Guardian pulled its ads, which had appeared alongside white nationalist content.

YouTube wants to make sure advertisers feel comfortable placing ads on the site, so it’s taking steps to analyze not just which videos can be monetized, but which videos are allowed on the site at all.

“So starting today, we’re taking a tougher stance on hateful, offensive and derogatory content,” Schindler wrote in his letter. “This includes removing ads more effectively from content that is attacking or harassing people based on their race, religion, gender or similar categories. This change will enable us to take action, where appropriate, on a larger set of ads and sites.”

People have been calling on YouTube to take steps to control its content for some time now, but it looks like it took a possible financial hit to finally spur action.

It’s not entirely clear how YouTube plans to judge which content it deems inappropriate, but it seems that fairly strict guidelines will be in place. Create content that attacks people, and YouTube may not give you a platform.
